Predictive learning on hidden tree-structured Ising models

Konstantinos E. Nikolakakis, Dionysios S. Kalogerias, Anand D. Sarwate

Research output: Contribution to journal › Article › peer-review


We provide high-probability sample complexity guarantees for exact structure recovery and accurate predictive learning using noise-corrupted samples from an acyclic (tree-shaped) graphical model. The hidden variables follow a tree-structured Ising model distribution, whereas the observable variables are generated by a binary symmetric channel taking the hidden variables as its input (flipping each bit independently with some constant probability q ∈ [0, 1/2)). In the absence of noise, predictive learning on Ising models was recently studied by Bresler and Karzand (2020); this paper quantifies how noise in the observed samples impacts the tasks of structure recovery and marginal distribution estimation by proving upper and lower bounds on the sample complexity. Our results generalize state-of-the-art bounds reported in prior work, and they exactly recover the noiseless case (q = 0). In fact, for any tree with p vertices and probability of incorrect recovery δ > 0, the sufficient number of samples remains logarithmic as in the noiseless case, i.e., O(log(p/δ)), while the dependence on q is O(1/(1 - 2q)^4) for both aforementioned tasks. We also present a new equivalent of Isserlis' Theorem for sign-valued tree-structured distributions, yielding a low-complexity algorithm for higher-order moment estimation.
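To make the setting concrete, below is a minimal simulation sketch, not the authors' code: it draws samples from a small tree-structured Ising model (an 8-vertex chain with uniform edge coupling θ = 0.6, both illustrative choices), passes them through a binary symmetric channel with q = 0.2, and attempts structure recovery with a Chow-Liu-style estimator, implemented here as a maximum-weight spanning tree on the absolute empirical correlations (equivalent to Chow-Liu for symmetric binary variables). All parameter values and function names are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def sample_tree_ising(parent, theta, n):
    """Draw n samples of a {-1,+1} tree Ising model with zero external field.

    parent[v] is the parent of vertex v (parent[0] = -1 marks the root);
    theta[v] is the coupling on the edge (parent[v], v), so the edge
    correlation is E[X_parent X_v] = tanh(theta[v])."""
    p = len(parent)
    X = np.empty((n, p), dtype=int)
    X[:, 0] = rng.choice([-1, 1], size=n)            # uniform root spin
    for v in range(1, p):                            # parents precede children
        agree = rng.random(n) < (1 + np.tanh(theta[v])) / 2
        X[:, v] = np.where(agree, X[:, parent[v]], -X[:, parent[v]])
    return X

def bsc(X, q):
    """Flip each observed spin independently with probability q."""
    flips = rng.random(X.shape) < q
    return np.where(flips, -X, X)

def chow_liu_edges(Y):
    """Max-weight spanning tree on |empirical correlations| (Kruskal).

    For symmetric binary variables, mutual information is increasing in
    |E[Y_i Y_j]|, so this tree matches the Chow-Liu tree."""
    n, p = Y.shape
    corr = (Y.T @ Y) / n
    pairs = sorted(((abs(corr[i, j]), i, j)
                    for i in range(p) for j in range(i + 1, p)),
                   reverse=True)
    root = list(range(p))
    def find(a):                                     # union-find with halving
        while root[a] != a:
            root[a] = root[root[a]]
            a = root[a]
        return a
    edges = set()
    for _, i, j in pairs:
        ri, rj = find(i), find(j)
        if ri != rj:
            root[ri] = rj
            edges.add((i, j))
    return edges

# A small chain 0-1-2-...-7 as the hidden tree (illustrative choice).
p = 8
parent = [-1] + list(range(p - 1))
theta = np.full(p, 0.6)                              # uniform edge strength
X = sample_tree_ising(parent, theta, n=5000)
Y = bsc(X, q=0.2)                                    # noisy observations

true_edges = {(parent[v], v) for v in range(1, p)}
print("recovered:", chow_liu_edges(Y) == true_edges)

The sketch leans on a fact implicit in the setup: for i ≠ j the channel scales every pairwise correlation uniformly, E[Y_i Y_j] = (1 - 2q)^2 E[X_i X_j], so the correlation ordering (and hence the Chow-Liu tree) is preserved under noise, at the price of the O(1/(1 - 2q)^4) factor in sample complexity quoted above.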

Original language: English (US)
Journal: Journal of Machine Learning Research
State: Published - 2021

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence


Keywords

  • Chow-Liu Algorithm
  • Distribution Estimation
  • Hidden Markov Random Fields
  • Ising Model
  • Noisy Data
  • Predictive Learning
  • Structure Learning

