Maximum likelihood estimation of the Latent Class Model through model boundary decomposition

  • Elizabeth Allman Department of Mathematics and Statistics, University of Alaska, Fairbanks
  • Hector Banos Cervantes Department of Mathematics and Statistics, University of Alaska, Fairbanks
  • Robin Evans Department of Statistics, University of Oxford
  • Serkan Hosten San Francisco State University Mathematics Department
  • Kaie Kubjas Laboratory for Information & Decision Systems, Massachusetts Institute of Technology; Laboratoire d'Informatique, Université Pierre et Marie Curie
  • Daniel Lemke San Francisco State University Mathematics Department
  • John Rhodes Department of Mathematics and Statistics, University of Alaska, Fairbanks
  • Piotr Zwiernik Department of Economics and Business, Universitat Pompeu Fabra, Barcelona
Keywords: Maximum likelihood estimation, latent class model, tensor rank

Abstract

The Expectation-Maximization (EM) algorithm is routinely used for
maximum likelihood estimation in latent class analysis. However,
the EM algorithm comes with no guarantee of reaching the global
optimum. We study the geometry of the latent class model in order
to understand the behavior of the maximum likelihood estimator. In
particular, we characterize the boundary stratification of the
binary latent class model with a binary hidden variable. For small
models, such as those with three binary observed variables, we show
that this stratification allows exact computation of the maximum
likelihood estimator. In this case we use simulations to study the
maximum likelihood attraction basins of the various strata. Our
theoretical study is complemented by a careful analysis of the EM
fixed point ideal, which provides an alternative method for
studying the boundary stratification and maximizing the likelihood
function. In particular, we compute the minimal primes of this
ideal in the case of a binary latent class model with a binary or
ternary hidden random variable.
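For orientation, the generic EM iteration discussed in the abstract can be sketched for the smallest model treated in the paper: one binary hidden variable and three binary observed variables, fitted to a 2×2×2 table of counts. This is a minimal illustrative sketch of standard EM, not the exact algebraic method developed in the paper; all variable names are our own.

```python
import numpy as np

def em_latent_class(counts, n_iter=200, seed=0):
    """Generic EM for a latent class model with one binary hidden
    variable H and three binary observed variables X_1, X_2, X_3.

    counts: dict mapping each cell (x1, x2, x3) in {0,1}^3 to its count.
    Returns (pi, theta) with pi = P(H = 1) and
    theta[j, h] = P(X_j = 1 | H = h).
    """
    rng = np.random.default_rng(seed)
    pi = rng.uniform(0.3, 0.7)                  # P(H = 1), random start
    theta = rng.uniform(0.2, 0.8, size=(3, 2))  # theta[j, h] = P(X_j = 1 | H = h)
    cells = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    n = sum(counts[x] for x in cells)
    for _ in range(n_iter):
        # E-step: posterior P(H = 1 | x) for each cell of the table.
        post = {}
        for x in cells:
            lik = [(pi if h else 1 - pi)
                   * np.prod([theta[j, h] if x[j] else 1 - theta[j, h]
                              for j in range(3)])
                   for h in (0, 1)]
            post[x] = lik[1] / (lik[0] + lik[1])
        # M-step: closed-form weighted maximum likelihood updates.
        pi = sum(counts[x] * post[x] for x in cells) / n
        for j in range(3):
            for h in (0, 1):
                w = {x: counts[x] * (post[x] if h else 1 - post[x])
                     for x in cells}
                theta[j, h] = (sum(w[x] * x[j] for x in cells)
                               / sum(w[x] for x in cells))
    return pi, theta
```

Because EM may converge only to a local maximum, in practice one restarts it from many random initializations (varying `seed`); the paper's boundary stratification offers an exact alternative for such small models.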

Published
2019-04-10
Section
Special Volume in Honor of the Memory of S. E. Fienberg