

Poster

Mean Estimation in High-Dimensional Binary Markov Gaussian Mixture Models

Yihan Zhang · Nir Weinberger

Hall J (level 1) #822

Keywords: [ minimax rate ] [ high-dimensional statistics ] [ parameter estimation ] [ spectral estimator ] [ hidden Markov model ]


Abstract: We consider a high-dimensional mean estimation problem over a binary hidden Markov model, which illuminates the interplay between memory in data, sample size, dimension, and signal strength in statistical inference. In this model, an estimator observes n samples of a d-dimensional parameter vector θ ∈ ℝ^d, multiplied by a random sign S_i (1 ≤ i ≤ n) and corrupted by isotropic standard Gaussian noise. The sequence of signs {S_i}_{i∈[n]} ∈ {−1, +1}^n is drawn from a stationary homogeneous Markov chain with flip probability δ ∈ [0, 1/2]. As δ varies, this model smoothly interpolates between two well-studied models: the Gaussian Location Model (δ = 0) and the Gaussian Mixture Model (δ = 1/2). Assuming that the estimator knows δ, we establish a nearly minimax optimal (up to logarithmic factors) estimation error rate as a function of ‖θ‖, δ, d, n. We then provide an upper bound for the problem of estimating δ, assuming (possibly inaccurate) knowledge of θ. The bound is proved to be tight when θ is an accurately known constant. These results are then combined into an algorithm that estimates θ with δ unknown a priori, and theoretical guarantees on its error are stated.
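The observation model described in the abstract can be simulated in a few lines. The sketch below (our illustration, not code from the paper; all names and the choice of estimator are our assumptions) draws the sign chain with flip probability δ, generates the noisy observations X_i = S_i·θ + Z_i, and recovers θ up to sign with a simple spectral estimator based on the sample second-moment matrix, which uses E[XXᵀ] = θθᵀ + I_d but ignores the Markov memory in the signs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper).
d, n, delta = 50, 2000, 0.1
theta = np.ones(d) / np.sqrt(d)  # unit-norm parameter vector

# Sample the stationary sign chain: S_1 uniform on {-1, +1},
# then each sign flips independently with probability delta.
S = np.empty(n)
S[0] = rng.choice([-1.0, 1.0])
flips = rng.random(n - 1) < delta
for i in range(1, n):
    S[i] = -S[i - 1] if flips[i - 1] else S[i - 1]

# Observations: X_i = S_i * theta + Z_i, with Z_i ~ N(0, I_d).
X = S[:, None] * theta + rng.standard_normal((n, d))

# Spectral estimate: since E[X Xᵀ] = theta thetaᵀ + I_d, the top
# eigenvector of the sample second moment aligns with theta (up to sign),
# and the top eigenvalue minus 1 estimates ||theta||².
M = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(M)  # ascending eigenvalues
v = eigvecs[:, -1]
theta_hat = v * np.sqrt(max(eigvals[-1] - 1.0, 0.0))

# The sign of theta_hat is unidentifiable from second moments alone.
err = min(np.linalg.norm(theta_hat - theta),
          np.linalg.norm(theta_hat + theta))
```

This baseline treats the data as an unordered Gaussian mixture; the point of the paper is that when δ < 1/2 the memory in the sign sequence carries additional information that a better estimator can exploit.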
