Quantum Public Key Encryption for NISQ Devices
Nishant Rodrigues, Walter O. Krawec, Brad Lackey
et al.
Quantum public-key encryption (PKE), where public keys and/or ciphertexts can be quantum states, is an important primitive in quantum cryptography. Unlike classical PKE (e.g., RSA or ECC), quantum PKE can leverage quantum-secure cryptographic assumptions or the principles of quantum mechanics for security. It has great potential for providing secure cryptographic systems under weaker assumptions than are possible classically. In addition, it is of both practical and theoretical interest, opening the door to novel cryptographic systems not possible with classical information alone. While multiple quantum PKE schemes have been proposed, they require a large number of qubits acting coherently and are not practical on current noisy quantum devices. In this paper, we design a practical quantum PKE scheme that takes into account the constraints of current NISQ devices. Specifically, we design a PKE scheme with quantum-classical public keys and classical ciphertexts that is noise-resilient and requires only a small number of qubits acting coherently. In addition, our design provides tradeoffs between efficiency and the number of qubits required.
Scaling Law for Time Series Forecasting
Jingzhe Shi, Qinwei Ma, Huan Ma
et al.
A scaling law that rewards large datasets, complex models, and enhanced data granularity has been observed in various fields of deep learning. Yet, studies on time series forecasting have cast doubt on the scaling behaviors of deep learning methods in this setting: while more training data improves performance, more capable models do not always outperform less capable ones, and longer input horizons may hurt performance for some models. We propose a theory of scaling laws for time series forecasting that can explain these seemingly abnormal behaviors. We take into account the impact of dataset size and model complexity, as well as time series data granularity, particularly focusing on the look-back horizon, an aspect unexplored in previous theories. Furthermore, we empirically evaluate various models using a diverse set of time series forecasting datasets, which (1) verifies the validity of the scaling law in dataset size and model complexity within the realm of time series forecasting, and (2) validates our theoretical framework, particularly regarding the influence of the look-back horizon. We hope our findings may inspire new models targeting time series forecasting datasets of limited size, as well as large foundational datasets and models for time series forecasting in future work. Code for our experiments has been made public at https://github.com/JingzheShi/ScalingLawForTimeSeriesForecasting.
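The empirical scaling-law fit described in this abstract can be sketched as follows. This is an illustrative Python snippet on synthetic data; the functional form L(D) = A·D^(-alpha) + B, the grid search over the irreducible loss B, and all parameter values are assumptions for demonstration, not the paper's actual procedure or numbers.

```python
import numpy as np

# Fit a power-law scaling curve L(D) = A * D**(-alpha) + B to
# (dataset size, validation loss) pairs.  Synthetic data throughout.
rng = np.random.default_rng(0)
sizes = np.array([1e3, 3e3, 1e4, 3e4, 1e5, 3e5, 1e6])
true_A, true_alpha, true_B = 5.0, 0.35, 0.1
losses = true_A * sizes**(-true_alpha) + true_B + rng.normal(0, 1e-3, sizes.size)

def fit_power_law(sizes, losses, B_grid):
    """Grid-search the irreducible loss B; for each candidate, fit
    log(L - B) linearly in log D and keep the best-fitting triple."""
    best = None
    for B in B_grid:
        resid = losses - B
        if np.any(resid <= 0):
            continue
        slope, intercept = np.polyfit(np.log(sizes), np.log(resid), 1)
        pred = np.exp(intercept) * sizes**slope + B
        sse = np.sum((pred - losses) ** 2)
        if best is None or sse < best[0]:
            best = (sse, -slope, np.exp(intercept), B)
    return best[1], best[2], best[3]  # alpha, A, B

alpha, A, B = fit_power_law(sizes, losses, np.linspace(0.0, 0.2, 201))
print(f"alpha ~ {alpha:.2f}, A ~ {A:.2f}, B ~ {B:.2f}")
```

On clean synthetic data the fit recovers the generating exponent; on real forecasting benchmarks the same template would be applied per model family and per look-back horizon.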
On the theory of earthquakes: Paradoxical contradiction of Omori's law to the law of energy conservation
A. V. Guglielmi, B. I. Klain
After the main shock of an earthquake, aftershocks are observed. According to Omori's law, the frequency of aftershocks decreases hyperbolically over time. We noticed that, strictly speaking, Omori's law paradoxically contradicts the law of energy conservation. The contradiction is that the excitation of each aftershock consumes a finite portion of the source's energy, so that the total energy released by the source tends to infinity over time. The paradox is formally theoretical, but its analysis has proved useful. Eliminating the contradiction between Omori's law and the fundamental law of conservation of energy allowed us to further understand the nature of the phenomenological theory of aftershocks. We used the concept of deactivation of a source after the formation of a main rupture in it. We based the theory on the original aftershock evolution equation, which has the form of a first-order linear differential equation. Two ways to eliminate the paradoxical situation are indicated. Key words: earthquake source, aftershock, evolution equation, deactivation coefficient, inverse problem, Omori epoch, source bifurcation, logistic equation, Hirano-Utsu formula.
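The paradox stated in this abstract can be checked numerically. In the sketch below (hypothetical parameter values), Omori's law gives an aftershock frequency n(t) = k/(c + t); if each aftershock consumes at least a fixed energy e0 from the source, the cumulative energy released by time T grows like e0·k·log(1 + T/c), which diverges as T grows, contradicting a finite source energy.

```python
import numpy as np

k, c, e0 = 100.0, 10.0, 1.0  # assumed: events/day, days, energy per event

def cumulative_energy(T, steps=200_000):
    """Trapezoidal integral of e0 * n(t) = e0 * k / (c + t) from 0 to T."""
    t = np.linspace(0.0, T, steps)
    rate = e0 * k / (c + t)
    return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))

for T in (1e2, 1e4, 1e6):
    analytic = e0 * k * np.log(1.0 + T / c)
    print(f"T={T:>9.0f}  numeric={cumulative_energy(T):10.1f}  analytic={analytic:10.1f}")
```

The logarithmic growth is slow but unbounded, which is exactly why the paradox is "formally theoretical" yet must be eliminated, e.g. by source deactivation.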
Analysis of public transport (in)accessibility and land-use pattern in different areas in Singapore
Hoai Nguyen Huynh
As more and more people continue to live in highly urbanised areas across the globe, reliable accessibility to amenities and services plays a vital role in sustainable development. One of the challenges in addressing this issue is the consistent and equal provision of public services, including transport, for residents across the urban system. In this study, using a novel computational method combining geometrical analysis and information-theoretic measures, we analyse the accessibility to public transport in terms of the spatial coverage of the transport nodes (stops) and the quality of service at these nodes across different areas. Furthermore, using a network clustering procedure, we also characterise the land-use pattern of those areas and relate that to their public transport accessibility. Using Singapore as a case study, we find that the commercial areas in the CBD expectedly have excellent accessibility, and the residential areas also have good to very good accessibility. However, not every residential area is equally accessible. While the spatial coverage of stops in these areas is very good, the quality of service shows substantial variation among different regions, with high contrast between the central and eastern areas and the others in the west and north of the city-state. We believe this kind of analysis can yield a good understanding of the current level of public transport service across the urban system, and that the observed disparities can provide valuable and actionable insights for future development plans.
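The information-theoretic ingredient mentioned in this abstract can be illustrated with a minimal sketch: the Shannon entropy of the land-use composition around an area, as one possible measure of land-use mixing. The category names and proportions below are invented for illustration; the paper's actual measures and clustering procedure are not reproduced here.

```python
import numpy as np

def shannon_entropy(proportions):
    """Entropy (in bits) of a discrete land-use distribution;
    higher values indicate a more mixed area."""
    p = np.asarray(proportions, float)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log2(p)))

mixed_area = [0.3, 0.3, 0.2, 0.2]       # e.g. residential/commercial/park/industrial
single_use = [0.97, 0.01, 0.01, 0.01]   # an area dominated by one use
print(shannon_entropy(mixed_area), ">", shannon_entropy(single_use))
```

A uniformly mixed area over four categories attains the maximum of 2 bits, while a single-use area scores near zero, giving a simple scalar to relate to transport accessibility.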
Programming Languages and Law: A Research Agenda
James Grimmelmann
If code is law, then the language of law is a programming language. Lawyers and legal scholars can learn about law by studying programming-language theory, and programming-language tools can be usefully applied to legal problems. This article surveys the history of research on programming languages and law and presents ten promising avenues for future efforts. Its goals are to explain how the combination of programming languages and law is distinctive within the broader field of computer science and law, and to demonstrate with concrete examples the remarkable power of programming-language concepts in this new domain.
A Data-Driven Technique Using Millisecond Transients to Measure the Milky Way Halo
E. Platts, J. Xavier Prochaska, Casey J. Law
We introduce a new technique to constrain the line-of-sight integrated electron density of our Galactic halo $\text{DM}_\text{MW,halo}$ through analysis of the observed dispersion measure distributions of pulsars $\text{DM}_\text{pulsar}$ and fast radio bursts $\text{DM}_\text{FRB}$. We model these distributions, correcting for the Galactic interstellar medium, with kernel density estimation---well-suited to the small data regime---to find lower/upper bounds to the corrected $\text{DM}_\text{pulsar}$/$\text{DM}_\text{FRB}$ distributions: $\max[\text{DM}_\text{pulsar}] \approx 7\pm2 \text{ (stat)} \pm 9 \text{ (sys) pc cm}^{-3}$ and $\min[\text{DM}_\text{FRB}] \approx 63^{+27}_{-21} \text{ (stat)} \pm 9 \text{ (sys) pc cm}^{-3}$. Using bootstrap resampling to estimate uncertainties, we set conservative limits on the Galactic halo dispersion measure $-2 < \text{DM}_\text{MW,halo} < 123\text{ pc cm}^{-3}$ (95\% c.l.). The upper limit is especially conservative because it may include a non-negligible contribution from the FRB host galaxies and a non-zero contribution from the cosmic web. It strongly disfavors models where the Galaxy has retained the majority of its baryons with a density profile tracking the presumed dark matter density profile. Lastly, we perform Monte Carlo simulations of larger FRB samples to validate our technique and assess the sensitivity of ongoing and future surveys. We recover bounds of several tens of $\text{pc cm}^{-3}$, which may be sufficient to test whether the Galaxy has retained a majority of its baryonic mass. We estimate that a sample of several thousand FRBs will significantly tighten constraints on $\text{DM}_\text{MW,halo}$ and offer a valuable complement to other analyses.
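The bound-estimation idea in this abstract can be sketched as follows: smooth a small sample of ISM-corrected dispersion measures with a Gaussian kernel density estimate, read off the distribution's lower edge, and attach a statistical uncertainty via bootstrap resampling. The synthetic "FRB" sample, the Scott-rule bandwidth, and the 1%-of-peak edge definition below are all illustrative assumptions, not the paper's actual data or estimator.

```python
import numpy as np

rng = np.random.default_rng(42)
dm_frb = 100.0 + rng.exponential(scale=150.0, size=30)  # toy DM_FRB values, pc/cm^3

def gaussian_kde(grid, sample, bw):
    """Plain Gaussian KDE evaluated on `grid` with bandwidth `bw`."""
    z = (grid[:, None] - sample[None, :]) / bw
    return np.exp(-0.5 * z**2).mean(axis=1) / (bw * np.sqrt(2.0 * np.pi))

def kde_min(sample):
    """Lower edge: smallest grid point where the KDE reaches 1% of its peak."""
    bw = sample.std() * sample.size ** (-1 / 5)          # Scott's rule
    grid = np.linspace(sample.min() - 3 * bw, sample.max(), 2000)
    dens = gaussian_kde(grid, sample, bw)
    return float(grid[dens >= 0.01 * dens.max()][0])

# Bootstrap the edge estimate to get a statistical interval.
boot = np.array([kde_min(rng.choice(dm_frb, size=dm_frb.size, replace=True))
                 for _ in range(200)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"min[DM_FRB] ~ {kde_min(dm_frb):.0f}, 95% bootstrap interval [{lo:.0f}, {hi:.0f}]")
```

The same template, applied to the pulsar sample's maximum instead of the FRB sample's minimum, brackets the halo contribution from below and above.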
Unbiased Estimation of the Gradient of the Log-Likelihood in Inverse Problems
Ajay Jasra, Kody J. H. Law, Deng Lu
We consider the problem of estimating a parameter associated with a Bayesian inverse problem. Treating the unknown initial condition as a nuisance parameter, one must typically resort to a numerical approximation of the gradient of the log-likelihood and also adopt a discretization of the problem in space and/or time. We develop a new methodology to unbiasedly estimate the gradient of the log-likelihood with respect to the unknown parameter, i.e. the expectation of the estimate has no discretization bias. Such a property is not only useful for estimation in terms of the original stochastic model of interest, but can also be exploited in stochastic gradient algorithms, which benefit from unbiased estimates. Under appropriate assumptions, we prove that our estimator is not only unbiased but also of finite variance. In addition, when implemented on a single processor, we show that the cost to achieve a given level of error is comparable to that of multilevel Monte Carlo methods, both practically and theoretically. However, the new algorithm provides the possibility of parallel computation on arbitrarily many processors without any loss of efficiency, asymptotically. In practice, this means any precision can be achieved in a fixed, finite constant time, provided that enough processors are available.
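The debiasing-over-discretization-levels idea behind such estimators can be illustrated with a minimal single-term randomized-truncation sketch. Here the "level-l approximation" S_l is a trapezoidal estimate of a one-dimensional integral with 2^l panels, standing in for a discretized quantity whose limit is the target; the integrand, the level distribution, and all constants are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
f = np.cos                      # integrand on [0, pi/2]; exact integral = 1
a, b = 0.0, np.pi / 2

def S(l):
    """Level-l trapezoidal approximation with 2**l panels."""
    x = np.linspace(a, b, 2**l + 1)
    y = f(x)
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

def single_term_estimator():
    """Draw a random level L with P(L=l) = p_l and return
    (S_L - S_{L-1}) / p_L, with S_{-1} = 0.  The expectation telescopes
    to lim_l S_l, so the estimator has no discretization bias."""
    lmax = 20
    p = 2.0 ** (-1.5 * np.arange(lmax + 1))
    p /= p.sum()
    l = rng.choice(lmax + 1, p=p)
    prev = S(l - 1) if l > 0 else 0.0
    return (S(l) - prev) / p[l]

est = np.mean([single_term_estimator() for _ in range(20000)])
print(f"unbiased estimate ~ {est:.3f} (exact = 1)")
```

Because the level-differences shrink like 4^(-l) while p_l decays like 2^(-1.5l), the estimator also has finite variance, mirroring the unbiased-and-finite-variance property claimed in the abstract; independent replicas can be averaged across arbitrarily many processors.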
Global Public Health Surveillance using Media Reports: Redesigning GPHIN
Dave Carter, Marta Stojanovic, Philip Hachey
et al.
Global public health surveillance relies on reporting structures and the transmission of trustworthy health reports. In practice, however, these processes may not always be fast enough, or may be hindered by procedural, technical, or political barriers. GPHIN, the Global Public Health Intelligence Network, was designed in the late 1990s to scour mainstream news for health events, as news travels faster and more freely than official reports. This paper outlines the next generation of GPHIN, which went live in 2017, and reports on the design decisions underpinning its new functions and innovations.
WatchOut: A Road Safety Extension for Pedestrians on a Public Windshield Display
Matthias Geiger, Changkun Ou, Cedric Quintes
We conducted a field study to investigate whether public windshield displays are applicable as an additional interactive digital road safety warning sign. We focused on investigating the acceptance and usability of our novel public windshield display and its potential use for future applications. The study has shown that users are open-minded to the idea of an extraverted windshield display regardless of the use case, whether it is used for safety purposes or for different content. Contrary to our hypothesis, most people assumed they would mistrust the system even if it were as well established as traffic lights, and would primarily rely on their own perception.
Quantum radiation from a shaken two-level atom in vacuum
Lezhi Lo, C. K. Law
We present a non-relativistic theory of quantum radiation generated by shaking a two-level atom in vacuum. Such radiation has the same origin as photon emission in the dynamical Casimir effect. By performing a time-dependent "dressing" transformation of the Hamiltonian, we derive an interaction term that governs the radiation. In particular, we show that photon pairs can be generated not only by shaking the position of the atom, but also by changing the internal states of the atom. As applications of our theory, we calculate the emission rate from an oscillating atom and the multi-photon state generated in a single-photon scattering process.
Interactions mediated by a public good transiently increase cooperativity in growing Pseudomonas putida metapopulations
Felix Becker, Karl Wienand, Matthias Lechner
et al.
Bacterial communities have rich social lives. A well-established interaction involves the exchange of a public good in Pseudomonas populations, where the iron-scavenging compound pyoverdine, synthesized by some cells, is shared with the rest. Pyoverdine thus mediates interactions between producers and non-producers and can constitute a public good. This interaction is often used to test game theoretical predictions on the "social dilemma" of producers. Such an approach, however, underestimates the impact of specific properties of the public good, for example the consequences of its accumulation in the environment. Here, we experimentally quantify the costs and benefits of pyoverdine production in a specific environment, and build a model of population dynamics that explicitly accounts for the changing significance of accumulating pyoverdine as a chemical mediator of social interactions. The model predicts that, in an ensemble of growing populations (metapopulation) with different initial producer fractions (and consequently pyoverdine contents), the global producer fraction initially increases. Because the benefit of pyoverdine declines at saturating concentrations, the increase is only transient. Confirmed by experiments on metapopulations, our results show how a changing benefit of a public good can shape social interactions in a bacterial population.
Metcalfe's Law Revisited
Dmitri Nosovicki
Rudimentary mathematical analysis of simple network models suggests bandwidth-independent saturation of network growth dynamics and hints at a linear decrease in the information density of the data. However, it strongly confirms Metcalfe's law as a measure of network utility and suggests it can play an important role in network calculations. This paper establishes a mathematical notion of network value and analyses two conflicting models of a network. One, the traditional model, fails to manifest Metcalfe's law. The other, which observes the network in a wider context, both confirms Metcalfe's law and reveals its upper boundary.
The second laws of quantum thermodynamics
Fernando G. S. L. Brandao, Michał Horodecki, Nelly Huei Ying Ng
et al.
The second law of thermodynamics tells us which state transformations are so statistically unlikely that they are effectively forbidden. Its original formulation, due to Clausius, states that "Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time". The second law applies to systems composed of many interacting particles; however, it is becoming clear that one can make sense of thermodynamics in the regime where we only have a small number of particles interacting with a heat bath. Is there a second law of thermodynamics in this regime? Here, we find that for processes which are cyclic or very close to cyclic, the second law for microscopic systems takes on a very different form than it does at the macroscopic scale, imposing not just one constraint on what state transformations are possible, but an entire family of constraints. In particular, we find a family of free energies which generalise the traditional one, and show that they can never increase. We further find that there are three regimes which determine which family of second laws govern state transitions, depending on how cyclic the process is. In one regime one can cause an apparent violation of the usual second law, through a process of embezzling work from a large system which remains arbitrarily close to its original state. These second laws are not only relevant for small systems, but also apply to individual macroscopic systems interacting via long-range interactions, which only satisfy the ordinary second law on average. By making precise the definition of thermal operations, the laws of thermodynamics take on a simple form with the first law defining the class of thermal operations, the zeroeth law emerging as a unique condition ensuring the theory is nontrivial, and the remaining laws being a monotonicity property of our generalised free energies.
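The family of generalised free energies can be sketched in assumed notation (a hedged summary; the precise statement, including the role of catalysts and the restriction to states diagonal in the energy basis, is in the paper itself):

```latex
% Assumed notation: \tau = e^{-\beta H}/Z is the thermal state at bath
% temperature T; p_i and q_i are the eigenvalues of \rho and \tau for
% states diagonal in the energy basis; D_\alpha is the Renyi divergence.
F_\alpha(\rho) \;=\; k_B T \left( D_\alpha(\rho \,\|\, \tau) - \log Z \right),
\qquad
D_\alpha(\rho \,\|\, \tau) \;=\; \frac{1}{\alpha - 1}\,
  \log \sum_i p_i^{\alpha} q_i^{\,1-\alpha} .
% The second laws then require F_\alpha(\rho) \ge F_\alpha(\rho') for all
% \alpha \ge 0 for a transition \rho \to \rho' to be possible; at
% \alpha \to 1 one recovers the ordinary free energy and the usual second law.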
Adaptive and bounded investment returns promote cooperation in spatial public goods games
Xiaojie Chen, Yongkui Liu, Yonghui Zhou
et al.
The public goods game is one of the most famous models for studying the evolution of cooperation in sizable groups. The multiplication factor in this game can characterize the investment return from the public good, which may be variable depending on the interactive environment in realistic situations. Instead of using the same universal value, here we consider that the multiplication factor in each group is updated based on the differences between the local and global interactive environments in the spatial public goods game, but meanwhile limited to within a certain range. We find that the adaptive and bounded investment returns can significantly promote cooperation. In particular, full cooperation can be achieved for high feedback strength when an appropriate limitation is set for the investment return. Also, we show that the fraction of cooperators in the whole population can become larger if the lower and upper limits of the multiplication factor are increased. Furthermore, in comparison to the traditional spatial public goods game where the multiplication factor in each group is identical and fixed, we find that cooperation can be better promoted if the multiplication factor is constrained to adjust between one and the group size in our model. Our results highlight the importance of the locally adaptive and bounded investment returns for the emergence and dominance of cooperative behavior in structured populations.
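The core quantities in such a model can be sketched minimally: the public goods payoff within one group, and a bounded, adaptive multiplication factor. The update rule below (raising r when the local cooperation level exceeds the global one, and clipping to [r_min, r_max]) is a simplified stand-in for the feedback studied in the paper; all parameter values are assumptions.

```python
import numpy as np

def pgg_payoffs(strategies, r, cost=1.0):
    """Payoffs in one public goods game group.
    strategies: 1 = cooperator (contributes `cost`), 0 = defector.
    Contributions are multiplied by r and shared equally."""
    strategies = np.asarray(strategies)
    pot = r * cost * strategies.sum() / strategies.size
    return pot - cost * strategies

def update_r(r, local_coop, global_coop, feedback=2.0, r_min=1.0, r_max=5.0):
    """Adaptive, bounded investment return: raise r where the local group
    cooperates more than the population average, lower it otherwise."""
    return float(np.clip(r + feedback * (local_coop - global_coop), r_min, r_max))

group = [1, 1, 1, 0, 0]                      # 3 cooperators, 2 defectors
pay = pgg_payoffs(group, r=3.0)
print("payoffs:", pay)                        # defectors free-ride on the pot
r_next = update_r(3.0, local_coop=0.6, global_coop=0.4)
print("updated r:", r_next)
```

With r = 3 in a group of five, each cooperator nets 0.8 while each defector nets 1.8, which is the social dilemma; the bounded adaptive r is what the abstract argues can nevertheless sustain cooperation on a lattice.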
Conditional strategies and the evolution of cooperation in spatial public goods games
Attila Szolnoki, Matjaz Perc
The fact that individuals will most likely behave differently in different situations begets the introduction of conditional strategies. Inspired by this, we study the evolution of cooperation in the spatial public goods game where, besides unconditional cooperators and defectors, different types of conditional cooperators also compete for space. Conditional cooperators will contribute to the public good only if other players within the group are likely to cooperate as well, but will withhold their contribution otherwise. Depending on the number of other cooperators required to elicit the cooperation of a conditional cooperator, the latter can be classified into as many types as there are players within each group. We find that the most cautious cooperators, i.e. those that require all other players within a group to be conditional cooperators, are the undisputed victors of the evolutionary process, even at very low synergy factors. We show that the remarkable promotion of cooperation is due primarily to the spontaneous emergence of the quarantining of defectors, who become surrounded by conditional cooperators and are forced into isolated convex "bubbles" from where they are unable to exploit the public good. This phenomenon can be observed only in structured populations, thus adding to the relevance of pattern formation for the successful evolution of cooperation.
A Fréchet law and an Erdös-Philipp law for maximal cuspidal windings
Johannes Jaerisch, Marc Kesseböhmer, Bernd O. Stratmann
In this paper we establish a Fréchet law for maximal cuspidal windings of the geodesic flow on a Riemannian surface associated with an arbitrary finitely generated, essentially free Fuchsian group with parabolic elements. This result extends previous work by Galambos and Dolgopyat and is obtained by applying Extreme Value Theory. Subsequently, we show that this law gives rise to an Erdös-Philipp law and to various generalised Khintchine-type results for maximal cuspidal windings. These results strengthen previous results by Sullivan, Stratmann and Velani for Kleinian groups, and extend earlier work by Philipp on continued fractions, which was inspired by a conjecture of Erdös.
Inside Trading, Public Disclosure and Imperfect Competition
Fuzhou Gong, Hong Liu
In this paper, we present a multi-period trading model in the style of Kyle's (1985) insider trading model, assuming that there are at least two insiders in the market with long-lived private information, under the requirement that each insider publicly discloses his stock trades after the fact. Based on this model, we study the influence of "public disclosure" and "competition among insiders" on the trading behavior of insiders. We find that "competition among insiders" leads to a higher effective price and lower insider profits, and that "public disclosure" makes each insider play a mixed strategy in every round except the last one. An interesting finding is that as the total number of auctions goes to infinity, the market depth and the trading intensity at the first auction are both constant under the requirement of "public disclosure", while without this requirement the market depth at the first auction goes to zero and the trading intensity of the first period goes to infinity. Moreover, we give the exact speed of the revelation of the private information, and show that all information is revealed immediately and the market depth goes to infinity immediately as trading happens infinitely frequently.
Statistical test of Duane-Hunt's law and its comparison with an alternative law
Milan Perkovac
Using the Pearson correlation coefficient, a statistical analysis of Duane-Hunt's and Kulenkampff's measurement results was performed. This analysis reveals that the empirically based Duane-Hunt law is not entirely consistent with the measurement data. The author has theoretically found the action of electromagnetic oscillators, which corresponds to Planck's constant, and has also found an alternative law based on classical theory. Using the same statistical method, this alternative law is likewise tested, and it is proved that the alternative law is completely in accordance with the measurements. The alternative law gives a relativistic expression for the energy of an electromagnetic wave emitted or absorbed by atoms and shows that the empirically derived Planck-Einstein expression is only valid for relatively low frequencies. A wave equation, which is similar to the Schrödinger equation, and the wavelength of the standing electromagnetic wave are also established by the author's analysis. For relatively low energies this wavelength becomes equal to the de Broglie wavelength. Without any quantum conditions, the author derives a formula similar to Rydberg's formula, which can be applied to all known atoms, neutrons, and some hyperons.
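The kind of consistency check named in this abstract can be sketched as follows: compute the Pearson correlation coefficient between measured values and the predictions of a candidate law. Here the law is the Duane-Hunt short-wavelength limit, lambda_min = h·c/(e·V), and the "measurements" are synthetic values with small noise; this illustrates the statistical procedure only and uses none of the paper's data.

```python
import numpy as np

h, c_light, e = 6.62607e-34, 2.99792e8, 1.60218e-19  # SI constants

def pearson_r(x, y):
    """Pearson correlation coefficient between two samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2)))

voltages = np.array([20e3, 25e3, 30e3, 35e3, 40e3])   # tube voltages, V
predicted = h * c_light / (e * voltages)              # lambda_min, m
rng = np.random.default_rng(3)
measured = predicted * (1.0 + rng.normal(0.0, 0.01, voltages.size))

r = pearson_r(predicted, measured)
print(f"Pearson r between law and measurement: {r:.4f}")
```

An r very close to 1 indicates the law tracks the data; systematic departures of r from 1 across datasets are the kind of evidence the author uses to compare the two laws.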
The second law of blackhole dynamics
Koustubh Ajit Kabe
In this paper, the non-generalized, or restricted, second law of blackhole dynamics, as originally given by Bekenstein, is restated, with a rigorous proof, in a different form akin to Clausius's statement of the second law of thermodynamics. The various physical possibilities and implications of this statement are then discussed. This paper is a mere venture into the restricted second law of blackhole dynamics pertaining to blackholes emitting Hawking radiation. The paper thus offers a didactically interesting reformulation of the second law of blackhole thermodynamics after some revisions.
Power law relaxation in a complex system: Omori law after a financial market crash
Fabrizio Lillo, Rosario N. Mantegna
We study the relaxation dynamics of a financial market just after the occurrence of a crash by investigating the number of times the absolute value of an index return exceeds a given threshold value. We show that the empirical observation of a power-law evolution of the number of events exceeding the selected threshold (a behavior known as the Omori law in geophysics) is consistent with the simultaneous occurrence of (i) a return probability density function characterized by a power-law asymptotic behavior and (ii) a power-law relaxation decay of its typical scale. Our empirical observation cannot be explained within the framework of simple and widespread stochastic volatility models.
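The measurement described in this abstract can be sketched on synthetic data: count how often the absolute return exceeds a threshold in successive windows after a "crash", and watch the counts decay. The return series below has a power-law relaxing volatility by construction; the threshold, window length, and exponent are illustrative assumptions, not the paper's empirical values.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(1, 5001)                      # time steps after the crash
sigma = 0.02 * t**(-0.3)                    # power-law volatility relaxation
returns = rng.normal(0.0, sigma)            # Gaussian returns with decaying scale

threshold = 0.01
window = 250
counts = [int(np.sum(np.abs(returns[i:i + window]) > threshold))
          for i in range(0, returns.size, window)]
print("exceedances per window:", counts)    # decays over time, Omori-like
```

Plotting the cumulative number of exceedances against time on log axes and fitting a power law is the standard way to extract the Omori-like exponent from such counts.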