Results for "Information theory"

Showing 20 of ~5,379,705 results · from arXiv, CrossRef

arXiv Open Access 2026
Nonlinear Information Theory: Characterizing Distributional Uncertainty in Communication Models with Sublinear Expectation

Wen-Xuan Lang, Shaoshi Yang, Jianhua Zhang et al.

A mathematical framework for information-theoretic analysis is established, with a new viewpoint of describing transmitted messages and communication channels by the nonlinear expectation theory, beyond the framework of classical probability theory. The major motivation of this research is to capture the probabilistic distribution uncertainty within increasingly complex communication networks, where random phenomena are often nonstationary, heterogeneous, and cannot be characterized by a single probability distribution. Based on the nonlinear expectation theory, in this paper we first explicitly define several fundamental concepts, such as nonlinear information entropy, nonlinear joint entropy, nonlinear conditional entropy and nonlinear mutual information, and establish their basic properties. Secondly, by using the strong law of large numbers under sublinear expectations, we propose a nonlinear source coding theorem, which shows that the nonlinear information entropy is the upper bound of the achievable coding rate of sources whose distributions are uncertain under the maximum error probability criterion, and determines a cluster point of the coding rate of such sources under the minimum error probability criterion. Thirdly, we propose a nonlinear channel coding theorem, which gives the explicit expression of the upper bound under the maximum error probability criterion and a cluster point under the minimum error probability criterion, respectively, for the achievable coding rate of communication channels whose distributions are uncertain. Additionally, we propose a nonlinear rate-distortion source coding theorem, proving that the rate-distortion function based on the nonlinear mutual information is a cluster point of the lossy compression performance of uncertain-distribution sources under the minimum expected distortion criterion.
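The abstract's "nonlinear entropy" is built on sublinear expectations over an uncertainty set of distributions. As a loose numerical intuition only (taking the supremum of classical Shannon entropy over a finite candidate set is an assumption of this sketch, not the paper's exact definition):

```python
import math

def shannon_entropy(p):
    """Classical Shannon entropy (bits) of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def upper_entropy(dists):
    """Toy surrogate for a 'nonlinear' (upper) entropy: the supremum of
    classical entropies over a finite uncertainty set of distributions."""
    return max(shannon_entropy(p) for p in dists)

# A binary source whose distribution is only known to lie in a finite set:
uncertainty_set = [[0.5, 0.5], [0.9, 0.1], [0.7, 0.3]]
H_upper = upper_entropy(uncertainty_set)  # attained by the uniform distribution
```

In this toy picture, a worst-case quantity of this kind plays the role that classical entropy plays as the benchmark for achievable coding rates.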

en cs.IT, math.PR
arXiv Open Access 2024
Semidefinite optimization of the quantum relative entropy of channels

Gereon Koßmann, Mark M. Wilde

This paper introduces a method for calculating the quantum relative entropy of channels, an essential quantity in quantum channel discrimination and resource theories of quantum channels. By building on recent developments in the optimization of relative entropy for quantum states [Koßmann and Schwonnek, arXiv:2404.17016], we introduce a discretized linearization of the integral representation for the relative entropy of states, enabling us to handle maximization tasks for the relative entropy of channels. Our approach here extends previous work on minimizing relative entropy to the more complicated domain of maximization. It also provides efficiently computable upper and lower bounds that sandwich the true value with any desired precision, leading to a practical method for computing the relative entropy of channels.

en quant-ph, cs.IT
arXiv Open Access 2024
Blind Interference Alignment for MapReduce: Exploiting Side-information with Reconfigurable Antennas

Yuxiang Lu, Syed A. Jafar

In order to explore how blind interference alignment (BIA) schemes may take advantage of side-information in computation tasks, we study the degrees of freedom (DoF) of a $K$ user wireless network setting that arises in full-duplex wireless MapReduce applications. In this setting the receivers are assumed to have reconfigurable antennas and channel knowledge, while the transmitters have neither, i.e., the transmitters lack channel knowledge and are only equipped with conventional antennas. The central ingredient of the problem formulation is the message structure arising out of the Shuffle phase of MapReduce, whereby each transmitter has a subset of messages that need to be delivered to various receivers, and each receiver has a subset of messages available to it in advance as side-information. We approach this problem by decomposing it into distinctive stages that help identify key ingredients of the overall solution. The novel elements that emerge from the first stage, called broadcast with groupcast messages, include an outer maximum distance separable (MDS) code structure at the transmitter, and an algorithm for iteratively determining groupcast-optimal reconfigurable antenna switching patterns at the receiver to achieve intra-message (among the symbols of the same message) alignment. The next stage, called unicast with side-information, reveals optimal inter-message (among symbols of different messages) alignment patterns to exploit side-information, and by a relabeling of messages, connects to the desired MapReduce setting.

arXiv Open Access 2023
Transmission Design for Active RIS-Aided Simultaneous Wireless Information and Power Transfer

Hong Ren, Zhiwei Chen, Guosheng Hu et al.

Reconfigurable intelligent surface (RIS) is a revolutionary technology to enhance both the spectral efficiency and energy efficiency of wireless communication systems. However, most of the existing contributions mainly focused on the study of passive RIS, which suffers from the ``double fading'' effect. On the other hand, active RIS, which is equipped with amplifiers, can effectively address this issue. In this paper, we propose an active RIS-aided simultaneous wireless information and power transfer (SWIPT) system. Specifically, we maximize the weighted sum rate of the information receivers, subject to the minimum power received at all energy receivers, the amplification power constraint at the active RIS, and the maximum transmit power constraint at the base station (BS). By adopting an alternating optimization framework, suboptimal solutions are obtained. Simulation results show that the active RIS-aided SWIPT system achieves a higher performance gain under the same power budget.

en cs.IT
arXiv Open Access 2021
Match Your Words! A Study of Lexical Matching in Neural Information Retrieval

Thibault Formal, Benjamin Piwowarski, Stéphane Clinchant

Neural Information Retrieval models hold the promise to replace lexical matching models, e.g. BM25, in modern search engines. While their capabilities have fully shone on in-domain datasets like MS MARCO, they have recently been challenged on out-of-domain zero-shot settings (BEIR benchmark), questioning their actual generalization capabilities compared to bag-of-words approaches. Particularly, we wonder if these shortcomings could (partly) be the consequence of the inability of neural IR models to perform lexical matching off-the-shelf. In this work, we propose a measure of discrepancy between the lexical matching performed by any (neural) model and an 'ideal' one. Based on this, we study the behavior of different state-of-the-art neural IR models, focusing on whether they are able to perform lexical matching when it's actually useful, i.e. for important terms. Overall, we show that neural IR models fail to properly generalize term importance on out-of-domain collections or terms almost unseen during training.

en cs.IR, cs.CL
arXiv Open Access 2021
Quantum Information in Relativity: the Challenge of QFT Measurements

Charis Anastopoulos, Ntina Savvidou

Proposed quantum experiments in deep space will be able to explore quantum information issues in regimes where relativistic effects are important. In this essay, we argue that a proper extension of Quantum Information theory into the relativistic domain requires the expression of all informational notions in terms of quantum field theoretic (QFT) concepts. This task requires a working and practicable theory of QFT measurements. We present the foundational problems in constructing such a theory, especially in relation to longstanding causality and locality issues in the foundations of QFT. Finally, we present the ongoing Quantum Temporal Probabilities program for constructing a measurement theory that (i) works, in principle, for any QFT, (ii) allows for a first-principles investigation of all relevant issues of causality and locality, and (iii) can be directly applied to experiments of current interest.

en quant-ph, gr-qc
arXiv Open Access 2021
Optimal Rate-Distortion-Leakage Tradeoff for Single-Server Information Retrieval

Yauhen Yakimenka, Hsuan-Yin Lin, Eirik Rosnes et al.

Private information retrieval protocols guarantee that a user can privately and losslessly retrieve a single file from a database stored across multiple servers. In this work, we propose to simultaneously relax the conditions of perfect retrievability and privacy in order to obtain improved download rates when all files are stored uncoded on a single server. Information leakage is measured in terms of the average success probability for the server of correctly guessing the identity of the desired file. The main findings are: i) The derivation of the optimal tradeoff between download rate, distortion, and information leakage when the file size is infinite. Closed-form expressions of the optimal tradeoff for the special cases of "no-leakage" and "no-privacy" are also given. ii) A novel approach based on linear programming (LP) to construct schemes for a finite file size and an arbitrary number of files. The proposed LP approach can be leveraged to find provably optimal schemes with corresponding closed-form expressions for the rate-distortion-leakage tradeoff when the database contains at most four bits. Finally, for a database that contains 320 bits, we compare two construction methods based on the LP approach with a nonconstructive scheme downloading subsets of files using a finite-length lossy compressor based on random coding.

en cs.IT
CrossRef Open Access 2020
Entropic Integrated Information Theory-Theory of Consciousness

Siddharth Sharma

In this paper I give a mathematical formulation of Integrated Information Theory, using entropy as the measure of information and hence as the information distance function. We also consider a set whose open subsets are mechanisms and whose topology is the system; we use these two modifications to the structure of Integrated Information Theory [2] and Quantum Integrated Information Theory [3] to define Entropic Integrated Information Theory. We further justify, using [1], why entropy should be used as a measure of cause/effect information and as the information distance function. We also examine the relationship of entanglement with concepts and conceptual information. This paper is an attempt to bind consciousness, quantum information, entanglement, and quantum mechanics together.

arXiv Open Access 2019
Batalin-Vilkovisky formalism in the $p$-adic Dwork theory

Dohyeong Kim, Jeehoon Park, Junyeong Park

The goal of this article is to develop BV (Batalin-Vilkovisky) formalism in the $p$-adic Dwork theory. Based on this formalism, we explicitly construct a $p$-adic dGBV algebra (differential Gerstenhaber-Batalin-Vilkovisky algebra) for a smooth projective complete intersection variety $X$ over a finite field, whose cohomology gives the $p$-adic Dwork cohomology of $X$, and its cochain endomorphism (the $p$-adic Dwork Frobenius operator) which encodes the information of the zeta function of $X$. As a consequence, we give a modern deformation theoretic interpretation of Dwork's theory of the zeta function of $X$ and derive a formula for the $p$-adic Dwork Frobenius operator in terms of homotopy Lie morphisms and the Bell polynomials.

en math.NT, math.AT
arXiv Open Access 2019
Information theory for non-stationary processes with stationary increments

Carlos Granero-Belinchon, Stéphane G. Roux, Nicolas Garnier

We describe how to analyze the wide class of non-stationary processes with stationary centered increments using Shannon information theory. To do so, we use a practical viewpoint and define ersatz quantities from time-averaged probability distributions. These ersatz versions of entropy, mutual information and entropy rate can be estimated when only a single realization of the process is available. We abundantly illustrate our approach by analyzing Gaussian and non-Gaussian self-similar signals, as well as multi-fractal signals. Using Gaussian signals allows us to check that our approach is robust, in the sense that all quantities behave as expected from analytical derivations. Using the stationarity (independence of the integration time) of the ersatz entropy rate, we show that this quantity is not only able to finely probe the self-similarity of the process but also offers a new way to quantify its multi-fractality.
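The ersatz quantities above replace ensemble averages by time averages over a single realization. A minimal sketch of that viewpoint (the histogram estimator and bin count are choices of this sketch, not the paper's estimator):

```python
import math
import random

def ersatz_entropy(signal, bins=16):
    """Estimate an 'ersatz' Shannon entropy (bits) from one realization by
    replacing the ensemble distribution with a time-averaged histogram."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    counts = [0] * bins
    for x in signal:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(signal)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

random.seed(0)
# A random walk is non-stationary, but its increments are i.i.d. (stationary),
# so the time-averaged histogram of the increments is meaningful.
increments = [random.gauss(0, 1) for _ in range(10000)]
H_inc = ersatz_entropy(increments)
```

The same idea extends to joint histograms of time-shifted samples, which is how ersatz mutual information and entropy rate would be built.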

en cs.IT, cond-mat.stat-mech
arXiv Open Access 2017
Codes for Simultaneous Transmission of Quantum and Classical Information

Markus Grassl, Sirui Lu, Bei Zeng

We consider the characterization as well as the construction of quantum codes that allow to transmit both quantum and classical information, which we refer to as `hybrid codes'. We construct hybrid codes $[\![n,k{: }m,d]\!]_q$ with length $n$ and distance $d$, that simultaneously transmit $k$ qudits and $m$ symbols from a classical alphabet of size $q$. Many good codes such as $[\![7,1{: }1,3]\!]_2$, $[\![9,2{: }2,3]\!]_2$, $[\![10,3{: }2,3]\!]_2$, $[\![11,4{: }2,3]\!]_2$, $[\![11,1{: }2,4]\!]_2$, $[\![13,1{: }4,4]\!]_2$, $[\![13,1{: }1,5]\!]_2$, $[\![14,1{: }2,5]\!]_2$, $[\![15,1{: }3,5]\!]_2$, $[\![19,9{: }1,4]\!]_2$, $[\![20,9{: }2,4]\!]_2$, $[\![21,9{: }3,4]\!]_2$, $[\![22,9{: }4,4]\!]_2$ have been found. All these codes have better parameters than hybrid codes obtained from the best known stabilizer quantum codes.

en quant-ph, cs.IT
arXiv Open Access 2017
Statistical Physics and Information Theory Perspectives on Linear Inverse Problems

Junan Zhu

Many real-world problems in machine learning, signal processing, and communications assume that an unknown vector $x$ is measured by a matrix A, resulting in a vector $y=Ax+z$, where $z$ denotes the noise; we call this a single measurement vector (SMV) problem. Sometimes, multiple dependent vectors $x^{(j)}, j\in \{1,...,J\}$, are measured at the same time, forming the so-called multi-measurement vector (MMV) problem. Both SMV and MMV are linear models (LM's), and the process of estimating the underlying vector(s) $x$ from an LM given the matrices, noisy measurements, and knowledge of the noise statistics, is called a linear inverse problem. In some scenarios, the matrix A is stored in a single processor and this processor also records its measurements $y$; this is called centralized LM. In other scenarios, multiple sites are measuring the same underlying unknown vector $x$, where each site only possesses part of the matrix A; we call this multi-processor LM. Recently, due to an ever-increasing amount of data and ever-growing dimensions in LM's, it has become more important to study large-scale linear inverse problems. In this dissertation, we take advantage of tools in statistical physics and information theory to advance the understanding of large-scale linear inverse problems. The intuition of the application of statistical physics to our problem is that statistical physics deals with large-scale problems, and we can make an analogy between an LM and a thermodynamic system. In terms of information theory, although it was originally developed to characterize the theoretic limits of digital communication systems, information theory was later found to be rather useful in analyzing and understanding other inference problems. (The full abstract cannot fit in due to the space limit. Please refer to the PDF.)
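The SMV model $y = Ax + z$ described above can be made concrete with a tiny least-squares instance (pure Python, noiseless, with hypothetical numbers; the large-scale regime is where the dissertation's statistical-physics tools become necessary):

```python
def solve_lm(A, y):
    """Least-squares estimate of x in the linear model y = A x + z,
    via the 2x2 normal equations A^T A x = A^T y (toy 2-column SMV instance)."""
    m = len(A)
    ata = [[sum(A[i][r] * A[i][c] for i in range(m)) for c in range(2)]
           for r in range(2)]
    aty = [sum(A[i][r] * y[i] for i in range(m)) for r in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    return [(ata[1][1] * aty[0] - ata[0][1] * aty[1]) / det,
            (ata[0][0] * aty[1] - ata[1][0] * aty[0]) / det]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
x_true = [2.0, -1.0]
y = [sum(a * b for a, b in zip(row, x_true)) for row in A]  # noiseless: z = 0
x_hat = solve_lm(A, y)  # recovers [2.0, -1.0] up to floating point
```

With noise $z \neq 0$ and known noise statistics, the same inverse problem is what approximate message passing and related large-scale methods target.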

en cs.IT
arXiv Open Access 2017
Bounds on Information Combining With Quantum Side Information

Christoph Hirche, David Reeb

"Bounds on information combining" are entropic inequalities that determine how the information (entropy) of a set of random variables can change when these are combined in certain prescribed ways. Such bounds play an important role in classical information theory, particularly in coding and Shannon theory; entropy power inequalities are special instances of them. The arguably most elementary kind of information combining is the addition of two binary random variables (a CNOT gate), and the resulting quantities play an important role in Belief propagation and Polar coding. We investigate this problem in the setting where quantum side information is available, which has been recognized as a hard setting for entropy power inequalities. Our main technical result is a non-trivial, and close to optimal, lower bound on the combined entropy, which can be seen as an almost optimal "quantum Mrs. Gerber's Lemma". Our proof uses three main ingredients: (1) a new bound on the concavity of von Neumann entropy, which is tight in the regime of low pairwise state fidelities; (2) the quantitative improvement of strong subadditivity due to Fawzi-Renner, in which we manage to handle the minimization over recovery maps; (3) recent duality results on classical-quantum-channels due to Renes et al. We furthermore present conjectures on the optimal lower and upper bounds under quantum side information, supported by interesting analytical observations and strong numerical evidence. We finally apply our bounds to Polar coding for binary-input classical-quantum channels, and show the following three results: (A) Even non-stationary channels polarize under the polar transform. (B) The blocklength required to approach the symmetric capacity scales at most sub-exponentially in the gap to capacity. (C) Under the aforementioned lower bound conjecture, a blocklength polynomial in the gap suffices.

en quant-ph, cs.IT
arXiv Open Access 2016
Information-Theoretic Lower Bounds for Recovery of Diffusion Network Structures

Keehwan Park, Jean Honorio

We study the information-theoretic lower bound of the sample complexity of the correct recovery of diffusion network structures. We introduce a discrete-time diffusion model based on the Independent Cascade model for which we obtain a lower bound of order $\Omega(k \log p)$, for directed graphs of $p$ nodes, and at most $k$ parents per node. Next, we introduce a continuous-time diffusion model, for which a similar lower bound of order $\Omega(k \log p)$ is obtained. Our results show that the algorithm of Pouget-Abadie et al. is statistically optimal for the discrete-time regime. Our work also opens the question of whether it is possible to devise an optimal algorithm for the continuous-time regime.

en cs.LG, cs.IT
arXiv Open Access 2012
The Entropy Power Inequality and Mrs. Gerber's Lemma for Abelian Groups of Order 2^n

Varun Jog, Venkat Anantharam

Shannon's Entropy Power Inequality can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The entropy power inequality has played a key role in resolving a number of problems in information theory. It is therefore interesting to examine the existence of a similar inequality for discrete random variables. In this paper we obtain an entropy power inequality for random variables taking values in an abelian group of order 2^n, i.e. for such a group G we explicitly characterize the function f_G(x,y) giving the minimum entropy of the sum of two independent G-valued random variables with respective entropies x and y. Random variables achieving the extremum in this inequality are thus the analogs of Gaussians in this case, and these are also determined. It turns out that f_G(x,y) is convex in x for fixed y and, by symmetry, convex in y for fixed x. This is a generalization to abelian groups of order 2^n of the result known as Mrs. Gerber's Lemma.
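For the smallest case $G = \mathbb{Z}_2$, the function $f_G$ reduces to the classical Mrs. Gerber's Lemma and can be evaluated numerically; in this sketch the entropy inverse is computed by bisection (an implementation choice, not part of the result):

```python
import math

def h(p):
    """Binary entropy (bits)."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h_inv(x):
    """Inverse of binary entropy on [0, 1/2], by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h(mid) < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def f_z2(x, y):
    """Minimum entropy of the XOR of two independent binary variables with
    entropies x and y: h(a*(1-b) + b*(1-a)) with a = h^{-1}(x), b = h^{-1}(y)."""
    a, b = h_inv(x), h_inv(y)
    return h(a * (1 - b) + b * (1 - a))
```

Sanity checks match the expected behavior: a deterministic summand leaves the entropy unchanged, a uniform summand forces full entropy, and the output never drops below either input.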

en cs.IT, math.CO
arXiv Open Access 2012
Stein's density approach and information inequalities

Christophe Ley, Yvik Swan

We provide a new perspective on Stein's so-called density approach by introducing a new operator and characterizing class which are valid for a much wider family of probability distributions on the real line. We prove an elementary factorization property of this operator and propose a new Stein identity which we use to derive information inequalities in terms of what we call the \emph{generalized Fisher information distance}. We provide explicit bounds on the constants appearing in these inequalities for several important cases. We conclude with a comparison between our results and known results in the Gaussian case, hereby improving on several known inequalities from the literature.

en math.PR, cs.IT
arXiv Open Access 2011
A Partial Order on Uncertainty and Information

Jiahua Chen

Information and uncertainty are closely related and extensively studied concepts in a number of scientific disciplines such as communication theory, probability theory, and statistics. Increasing the information arguably reduces the uncertainty on a given random subject. Consider the uncertainty measure as the variance of a random variable. Given the information that its outcome is in an interval, the uncertainty is expected to reduce when the interval shrinks. This proposition is not generally true. In this paper, we provide a necessary and sufficient condition for this proposition when the random variable is absolutely continuous or integer valued. We also give a similar result on Shannon information.
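The non-monotonicity the abstract refers to is easy to reproduce numerically. In this toy integer-valued counterexample (the distribution is chosen for illustration and is not from the paper), removing an endpoint mass shifts conditional weight onto a far outlier, so the conditional variance goes up even though the interval shrinks:

```python
def cond_var(pmf, interval):
    """Variance of an integer-valued X conditioned on X lying in [lo, hi]."""
    lo, hi = interval
    mass = {x: p for x, p in pmf.items() if lo <= x <= hi}
    z = sum(mass.values())
    mean = sum(x * p for x, p in mass.items()) / z
    return sum((x - mean) ** 2 * p for x, p in mass.items()) / z

# Most mass near 0, plus a rare far outlier at 100.
pmf = {0: 0.5, 1: 0.49, 100: 0.01}
v_wide = cond_var(pmf, (0, 100))    # ~98.27
v_narrow = cond_var(pmf, (1, 100))  # ~192.10 -- larger on the smaller interval
```

The paper's necessary and sufficient condition characterizes exactly when such reversals cannot happen.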

en math.PR

Page 51 of 268986