Tong Zhang
ACM COLT 1999
We consider a model of learning in which the successive observations follow a certain Markov chain. The observations are labeled according to membership in some unknown target set. For a Markov chain with finitely many states we show that, if the target set belongs to a family of sets with finite VC dimension, then probably approximately correct (PAC) learning of this set is possible with polynomially large samples. Specifically, for observations following a random walk with state space X and uniform stationary distribution, the sample size required is at most O((t0/(1-λ2)) log(t0|X|/δ)), where t0 is the sample size sufficient for learning from i.i.d. observations, δ is the confidence level, and λ2 is the second largest eigenvalue of the transition matrix. We extend these results to Markov chains with countably many states using a Lyapunov function technique and recent results on mixing properties of infinite-state Markov chains.
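The bound in the abstract inflates the i.i.d. sample size t0 by roughly a factor of 1/(1-λ2), so the spectral gap of the chain directly controls the cost of dependent sampling. As a hypothetical illustration (the cycle graph and all names below are my own choices, not from the paper), the sketch computes λ2 for a simple random walk on a cycle, whose stationary distribution is uniform as in the abstract's setting:

```python
import numpy as np

# Simple random walk on a cycle with n states: from each state, step to
# either neighbor with probability 1/2.  The stationary distribution is
# uniform, matching the setting described in the abstract.
n = 16
P = np.zeros((n, n))
for i in range(n):
    P[i, (i - 1) % n] = 0.5
    P[i, (i + 1) % n] = 0.5

# P is symmetric, so its eigenvalues are real; sort them in decreasing order.
eigvals = np.sort(np.linalg.eigvalsh(P))[::-1]
lam2 = eigvals[1]  # second largest eigenvalue

# Rough multiplicative overhead of the Markov-chain sample-size bound
# relative to the i.i.d. sample size t0 (ignoring the logarithmic term).
inflation = 1.0 / (1.0 - lam2)
print(f"lambda_2 = {lam2:.4f}, inflation factor ~ {inflation:.1f}")
```

For the cycle, λ2 = cos(2π/n), so the overhead grows roughly like n²: slowly mixing chains need substantially more labeled observations than i.i.d. sampling.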