Mixtures of Normalizing Flows

9 pages • Published: November 2, 2021

Abstract

Normalizing flows fall into the category of deep generative models. They explicitly model a probability density function, and as a result such a model can learn probability distributions beyond the Gaussian one. Clustering is one of the main unsupervised machine learning tasks, and the most common probabilistic approach to solving a clustering problem is via Gaussian mixture models. Although a few approaches for constructing mixtures of normalizing flows exist in the literature, we propose a direct approach and use the masked autoregressive flow as the normalizing flow. We show the results obtained on 2D datasets and then on images. The results contain density plots or tables with clustering metrics in order to quantify the quality of the obtained clusters. Although on images we usually obtain worse results than other classic models, the 2D results show that more expressive mixtures of distributions (than Gaussian mixture models) can indeed be learned. The code which implements this method can be found at https://github.com/aciobanusebi/nf-mixture.

Keyphrases: clustering, machine learning, mixture models, mixtures of normalizing flows, normalizing flows

In: Yan Shi, Gongzhu Hu, Quan Yuan and Takaaki Goto (editors). Proceedings of ISCA 34th International Conference on Computer Applications in Industry and Engineering, vol. 79, pages 82-90.
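To illustrate the idea of a mixture of normalizing flows, here is a minimal, hypothetical sketch (not the paper's masked-autoregressive-flow implementation): each mixture component is a one-dimensional affine flow with a standard-normal base density, so its density follows from the change-of-variables formula, and the mixture density is the weighted sum of component densities. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def flow_log_density(x, a, b):
    """Log density of a toy affine flow x = a*z + b with z ~ N(0, 1).

    Change of variables: log p(x) = log N(z; 0, 1) + log |dz/dx|,
    where z = (x - b) / a and |dz/dx| = 1/|a|.
    """
    z = (x - b) / a
    log_base = -0.5 * (z ** 2 + np.log(2.0 * np.pi))
    log_det = -np.log(np.abs(a))
    return log_base + log_det

def mixture_log_density(x, weights, a_list, b_list):
    """log p(x) = logsumexp_k [ log pi_k + log p_k(x) ] for K flow components."""
    comps = np.stack([np.log(w) + flow_log_density(x, a, b)
                      for w, a, b in zip(weights, a_list, b_list)])
    m = comps.max(axis=0)  # stabilized log-sum-exp over components
    return m + np.log(np.exp(comps - m).sum(axis=0))

# Two components: one centered at -2 with scale 1, one at +2 with scale 0.5.
x = np.linspace(-5.0, 5.0, 11)
logp = mixture_log_density(x, [0.5, 0.5], [1.0, 0.5], [-2.0, 2.0])
```

For clustering, the soft assignment of a point to component k follows the usual mixture-model responsibility, r_k(x) ∝ pi_k * p_k(x); with expressive flows in place of the affine maps above, the components can capture non-Gaussian cluster shapes, which is the motivation given in the abstract.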