Parsimonious mixtures for the analysis of tensor-variate data

Salvatore D. Tomarchio*, Antonio Punzo, Luca Bagnato

*Corresponding author

Research output: Contribution to journal › Article

Abstract

Real data increasingly exhibit complex structures, raising the need for more flexible and parsimonious statistical methodologies. Tensor-variate (or multi-way) data are a typical example. Unfortunately, real data often contain atypical observations that make the traditional normality assumption inadequate. Thus, in this paper, we first introduce two new tensor-variate distributions, both heavy-tailed generalizations of the tensor-variate normal distribution. We then use these distributions for model-based clustering via finite mixture models. To introduce parsimony in the models, we apply the eigen-decomposition to the components' scale matrices, obtaining two families of parsimonious tensor-variate mixture models. As a by-product, we also introduce the parsimonious version of tensor-variate normal mixtures. For parameter estimation, we illustrate variants of the well-known EM algorithm. Since the number of parsimonious models depends on the order of the tensors, we implement strategies intended to shorten the initialization and fitting processes. These procedures are investigated via simulation studies. Finally, we fit our parsimonious models to two real datasets having a 4-way and a 5-way structure, respectively.
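For context, a minimal sketch of the two standard ingredients the abstract refers to: the tensor-variate (array) normal density that the new distributions generalize, and the eigen-decomposition commonly used to induce parsimony in component scale matrices. The notation below is illustrative and may differ from the authors' exact parameterization.

    % Density of a D-way tensor-variate (array) normal distribution for
    % X in R^{p_1 x ... x p_D}, with mean tensor M and mode-specific
    % scale matrices Sigma_1, ..., Sigma_D (Kronecker-structured covariance).
    f(\mathcal{X};\mathcal{M},\Sigma_1,\ldots,\Sigma_D)
      = (2\pi)^{-p/2}\Bigl(\prod_{d=1}^{D}|\Sigma_d|^{-p/(2p_d)}\Bigr)
        \exp\Bigl\{-\tfrac{1}{2}\bigl\|(\mathcal{X}-\mathcal{M})
        \times_1 \Sigma_1^{-1/2}\times_2\cdots\times_D \Sigma_D^{-1/2}\bigr\|^2\Bigr\},
      \qquad p=\prod_{d=1}^{D} p_d.

    % Parsimony via the eigen-decomposition of a component scale matrix
    % (Celeux-Govaert style): constraining the volume lambda_g, orientation
    % Gamma_g, and shape Delta_g to be equal or variable across components g
    % generates the family of parsimonious models.
    \Sigma_g = \lambda_g\,\Gamma_g\,\Delta_g\,\Gamma_g^{\top}.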
Original language: English
Pages (from-to): 1-27
Number of pages: 27
Journal: Statistics and Computing
Volume: 33
DOIs
Publication status: Published - 2023

Keywords

  • Tensor-variate mixtures
  • Model-based clustering
  • Tensor-data
