Abstract
We define a new class of random probability measures that approximates the well-known normalized generalized gamma (NGG) process. Our new process is defined through the representation of an NGG process as a discrete measure in which the weights are obtained by normalizing the jumps of a Poisson process and the support consists of independent, identically distributed location points; here, however, only jumps larger than a threshold ε are retained. Therefore, the number of jumps of the new process, called the ε-NGG process, is a.s. finite. A prior distribution for ε can be elicited. We use this process as the mixing measure in a mixture model for density and cluster estimation, and build an efficient Gibbs sampler scheme to simulate from the posterior. Finally, we discuss applications and performance of the model on two popular datasets, as well as comparisons with competing algorithms, namely the slice sampler and a posteriori truncation.
Keywords: Bayesian nonparametric mixture models, Normalized generalized gamma process, Blocked Gibbs sampler, Finite dimensional approximation, A priori truncation method
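To make the a priori truncation idea concrete, the following is a minimal sketch (not the authors' implementation) of drawing one realization of an ε-NGG process. It assumes the generalized gamma Lévy intensity ρ(s) = σ/Γ(1−σ) · s^(−1−σ) e^(−κs) and, purely for illustration, a standard normal base measure P0; the truncation level ε, the quadrature grid, and the rejection proposal are all choices made here, not prescriptions from the paper.

```python
import math
import random


def _poisson(lam, rng):
    # Knuth's algorithm; adequate for the moderate intensities that arise
    # once small jumps (below eps) are discarded.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1


def _sample_jump(eps, sigma, kappa, rng):
    # Retained jump sizes have density proportional to
    # s^(-1-sigma) * exp(-kappa*s) on (eps, inf).
    # Propose s = eps + Exp(kappa); the likelihood ratio is maximized at
    # s = eps, so accept with probability (eps/s)^(1+sigma).
    while True:
        s = eps + rng.expovariate(kappa)
        if rng.random() < (eps / s) ** (1.0 + sigma):
            return s


def sample_eps_ngg(eps, sigma, kappa, rng=None, n_grid=20_000):
    """Draw one realization (atoms, weights) of an eps-NGG process (sketch)."""
    rng = rng or random.Random()
    c = sigma / math.gamma(1.0 - sigma)
    # Mean number of jumps above eps:
    #   Lambda(eps) = int_eps^inf c * s^(-1-sigma) * exp(-kappa*s) ds,
    # approximated here by trapezoidal quadrature on a truncated grid.
    upper = eps + 30.0 / kappa
    h = (upper - eps) / n_grid
    xs = [eps + i * h for i in range(n_grid + 1)]
    fs = [c * s ** (-1.0 - sigma) * math.exp(-kappa * s) for s in xs]
    lam = h * (sum(fs) - 0.5 * (fs[0] + fs[-1]))
    n = _poisson(lam, rng)  # number of retained jumps: Poisson(lam), a.s. finite
    if n == 0:
        return [], []
    jumps = [_sample_jump(eps, sigma, kappa, rng) for _ in range(n)]
    total = sum(jumps)
    weights = [j / total for j in jumps]             # normalized jumps
    atoms = [rng.gauss(0.0, 1.0) for _ in range(n)]  # iid from P0 = N(0,1) (assumed)
    return atoms, weights
```

As ε decreases, more jumps are retained and the realization approaches a draw from the full NGG process; in the paper, ε itself carries a prior and is updated within the blocked Gibbs sampler rather than fixed as here.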