Unsupervised Learning of Gamma Mixture Models using Minimum Message Length

Y. Agusta and D.L. Dowe (Australia)

Keywords

Unsupervised Classification, Mixture Modelling, MML, Gamma

Abstract

Mixture modelling or unsupervised classification is the problem of identifying and modelling components in a body of data. Earlier work in mixture modelling using Minimum Message Length (MML) includes the multinomial and Gaussian distributions (Wallace and Boulton, 1968), the von Mises circular and Poisson distributions (Wallace and Dowe, 1994, 2000) and the t distribution (Agusta and Dowe, 2002a, 2002b). In this paper, we extend this research by considering MML mixture modelling using the Gamma distribution. The point estimation of the distribution was performed using the MML approximation proposed by Wallace and Freeman (1987) and gave impressive results compared to Maximum Likelihood (ML). We then considered mixture modelling on artificially generated datasets and compared the results with two other criteria, AIC and BIC. In terms of the resulting number of components, the results were again impressive. Application to the Heming Pike dataset was then examined and the results were compared in terms of the probability bit costings, showing that the proposed MML method performs better than AIC and BIC. A further application also shows that our method works well with datasets containing left-skewed components, such as the Palm Valley (Australia) image dataset.
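For readers who want a concrete feel for the comparison setup described above, the sketch below fits a Gamma mixture to synthetic data with EM and scores candidate component counts using AIC and BIC. It is not the authors' MML method: the message-length criterion of Wallace and Freeman (1987) is not implemented here, the M-step uses a simple responsibility-weighted moment-matching update rather than exact ML or MML point estimates, and all parameter values and the helper name fit_gamma_mixture are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

# Synthetic two-component Gamma mixture (hypothetical parameters, for illustration only)
x = np.concatenate([rng.gamma(shape=2.0, scale=1.0, size=300),
                    rng.gamma(shape=9.0, scale=0.5, size=200)])

def fit_gamma_mixture(x, k, n_iter=200):
    """EM for a k-component Gamma mixture with a moment-based M-step (a rough sketch)."""
    n = len(x)
    # Crude initialisation: split the sorted data into k blocks
    blocks = np.array_split(np.sort(x), k)
    pi = np.array([len(b) / n for b in blocks])
    mean = np.array([b.mean() for b in blocks])
    var = np.array([b.var() + 1e-6 for b in blocks])
    shape, scale = mean**2 / var, var / mean
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each point
        dens = np.stack([pi[j] * gamma.pdf(x, a=shape[j], scale=scale[j])
                         for j in range(k)], axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted moment matching per component (approximate, not exact ML)
        nk = resp.sum(axis=0)
        pi = nk / n
        mean = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mean)**2).sum(axis=0) / nk + 1e-12
        shape, scale = mean**2 / var, var / mean
    # Log-likelihood under the final parameters
    dens = np.stack([pi[j] * gamma.pdf(x, a=shape[j], scale=scale[j])
                     for j in range(k)], axis=1)
    loglik = np.log(dens.sum(axis=1)).sum()
    n_params = 3 * k - 1  # k shapes, k scales, k-1 free mixing proportions
    aic = -2 * loglik + 2 * n_params
    bic = -2 * loglik + n_params * np.log(n)
    return shape, scale, pi, aic, bic

for k in (1, 2, 3):
    *_, aic, bic = fit_gamma_mixture(x, k)
    print(f"k={k}  AIC={aic:.1f}  BIC={bic:.1f}")
```

In this setup, AIC and BIC select the number of components by penalising the negative log-likelihood with a parameter-count term; the paper's MML criterion instead compares the total two-part message length of model plus data, which is the quantity reported as probability bit costings in the abstract.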
