arXiv Open Access 2025

Cross Mutual Information

Chetan Gohil, Oliver M. Cliff, James M. Shine, Ben D. Fulcher, Joseph T. Lizier

Abstract

Mutual information (MI) is a useful information-theoretic measure to quantify the statistical dependence between two random variables: $X$ and $Y$. Often, we are interested in understanding how the dependence between $X$ and $Y$ in one set of samples compares to another. Although the dependence between $X$ and $Y$ in each set of samples can be measured separately using MI, these estimates cannot be compared directly if they are based on samples from a non-stationary distribution. Here, we propose an alternative measure for characterising how the dependence between $X$ and $Y$ as defined by one set of samples is expressed in another: *cross mutual information*. We present a comprehensive set of simulation studies sampling data with $X$-$Y$ dependencies to explore this measure. Finally, we discuss how this relates to measures of model fit in linear regression, and some future applications in neuroimaging data analysis.
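The abstract does not spell out the estimator, but one natural reading of "how the dependence defined by one set of samples is expressed in another" is a plug-in construction: fit a dependence model to sample set A, then average that model's pointwise log density ratio over sample set B. The sketch below illustrates this idea for the bivariate Gaussian case only; the function name `gaussian_cross_mi` and the specific plug-in form are this sketch's assumptions, not the paper's definition.

```python
import numpy as np

def gaussian_cross_mi(samples_a, samples_b):
    """Hypothetical Gaussian plug-in 'cross MI' sketch: fit a bivariate
    Gaussian to samples_a, then average the pointwise log density ratio
    log p_A(x, y) / (p_A(x) p_A(y)) over samples_b."""
    mu = samples_a.mean(axis=0)
    cov = np.cov(samples_a, rowvar=False)
    inv = np.linalg.inv(cov)
    diff = samples_b - mu
    # joint log-density of samples_b under the Gaussian fitted to samples_a
    maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
    log_joint = -0.5 * (maha + np.log(np.linalg.det(cov))
                        + 2 * np.log(2 * np.pi))
    # product of the fitted Gaussian marginals, evaluated at samples_b
    var = np.diag(cov)
    log_marg = -0.5 * ((diff ** 2) / var
                       + np.log(2 * np.pi * var)).sum(axis=1)
    return float(np.mean(log_joint - log_marg))

rng = np.random.default_rng(0)
# two sample sets sharing the same correlated Gaussian dependence (rho = 0.8)
cov = [[1.0, 0.8], [0.8, 1.0]]
a = rng.multivariate_normal([0.0, 0.0], cov, size=5000)
b = rng.multivariate_normal([0.0, 0.0], cov, size=5000)
xmi = gaussian_cross_mi(a, b)
```

When the two sets are drawn from the same distribution, as here, this quantity should approach the ordinary Gaussian MI, $-\tfrac{1}{2}\ln(1-\rho^2)$; when set B's dependence differs from set A's, the average log ratio drops, which is what makes such a cross-evaluated quantity comparable across sample sets.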


Citation

Gohil, C., Cliff, O.M., Shine, J.M., Fulcher, B.D., Lizier, J.T. (2025). Cross Mutual Information. https://arxiv.org/abs/2507.15372

Publication Information
Year Published: 2025
Language: en
Source Database: arXiv
Access: Open Access ✓