DOAJ Open Access 2022

Nonparametric Causal Structure Learning in High Dimensions

Shubhadeep Chakraborty, Ali Shojaie

Abstract

The PC and FCI algorithms are popular constraint-based methods for learning the structure of directed acyclic graphs (DAGs) in the absence and presence of latent and selection variables, respectively. These algorithms (and their order-independent variants, PC-stable and FCI-stable) have been shown to be consistent for learning sparse high-dimensional DAGs based on partial correlations. However, inferring conditional independences from partial correlations is valid only if the data are jointly Gaussian or generated from a linear structural equation model—an assumption that may be violated in many applications. To broaden the scope of high-dimensional causal structure learning, we propose nonparametric variants of the PC-stable and FCI-stable algorithms that employ the conditional distance covariance (CdCov) to test for conditional independence relationships. As the key theoretical contribution, we prove that the high-dimensional consistency of the PC-stable and FCI-stable algorithms carries over to general distributions over DAGs when we implement CdCov-based nonparametric tests for conditional independence. Numerical studies demonstrate that our proposed algorithms perform nearly as well as PC-stable and FCI-stable for Gaussian distributions, and offer advantages in non-Gaussian graphical models.
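To illustrate the statistic underlying the tests described above, the sketch below computes the (unconditional) sample distance covariance in plain NumPy; the paper's CdCov extends this building block by kernel-weighting the pairwise distances on the conditioning variables, which is not reproduced here. The function name and simulated data are illustrative choices, not the authors' implementation.

```python
import numpy as np

def distance_covariance(x, y):
    """Sample distance covariance between two 1-D samples.

    This is the unconditional statistic; the paper's CdCov-based tests
    condition it on a separating set via kernel weighting (not shown).
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    # Pairwise absolute-difference (Euclidean) distance matrices.
    a = np.abs(x - x.T)
    b = np.abs(y - y.T)
    # Double-center each distance matrix.
    A = a - a.mean(axis=0) - a.mean(axis=1, keepdims=True) + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1, keepdims=True) + b.mean()
    # The V-statistic estimate of squared distance covariance is the
    # mean of the elementwise product; it is nonnegative by construction.
    return np.sqrt((A * B).mean())

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x**2 + 0.1 * rng.normal(size=500)  # dependent on x, but uncorrelated
z = rng.normal(size=500)               # independent of x
```

Unlike the partial correlation used by the original PC/FCI tests, distance covariance is zero if and only if the two variables are independent, so it detects the nonlinear dependence between `x` and `y` above even though their Pearson correlation is near zero.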

Authors (2)

Shubhadeep Chakraborty

Ali Shojaie

Citation Format

Chakraborty, S., & Shojaie, A. (2022). Nonparametric Causal Structure Learning in High Dimensions. https://doi.org/10.3390/e24030351

Journal Information
Publication Year
2022
Source Database
DOAJ
DOI
10.3390/e24030351
Access
Open Access ✓