## Abstract

We consider the problem of estimating a density function from a sequence of N independent and identically distributed observations X_i taking values in a set 𝒳 ⊂ ℝ^d. The estimation procedure constructs a convex mixture of 'basis' densities and estimates the parameters by maximum likelihood. Viewing the error as the sum of two terms, the approximation error, which measures the adequacy of the model, and the estimation error, which results from the finiteness of the sample, we derive upper bounds on the expected total error, thus obtaining bounds on the rate of convergence. These results then allow us to derive explicit expressions relating the sample complexity and the model complexity.
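The procedure described in the abstract, fitting a convex mixture of basis densities by maximum likelihood, can be illustrated with a minimal sketch. The paper treats general basis densities; the example below assumes one-dimensional Gaussian components and uses the EM algorithm as the maximum-likelihood fitter, which is a standard choice but not necessarily the one analyzed in the paper.

```python
import numpy as np

def fit_gaussian_mixture(x, k=2, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture by maximum likelihood (EM).

    Returns the mixing weights, component means, and standard deviations.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialization: random data points as means, pooled spread as scale.
    means = rng.choice(x, size=k, replace=False)
    sigmas = np.full(k, x.std() + 1e-6)
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        log_comp = (np.log(weights)
                    - 0.5 * np.log(2 * np.pi * sigmas**2)
                    - 0.5 * ((x[:, None] - means) / sigmas) ** 2)
        log_norm = np.logaddexp.reduce(log_comp, axis=1, keepdims=True)
        r = np.exp(log_comp - log_norm)
        # M-step: weighted maximum-likelihood updates of the parameters.
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r * x[:, None]).sum(axis=0) / nk
        sigmas = np.sqrt((r * (x[:, None] - means) ** 2).sum(axis=0) / nk) + 1e-8
    return weights, means, sigmas
```

In the paper's error decomposition, increasing k shrinks the approximation error (a richer mixture model), while for a fixed sample size N it tends to inflate the estimation error; the derived bounds quantify this trade-off.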

| Original language | English |
| --- | --- |
| Pages (from-to) | 99-109 |
| Number of pages | 11 |
| Journal | Neural Networks |
| Volume | 10 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1997 |
| Externally published | Yes |

## Keywords

- approximation error
- convergence rates
- density estimation
- maximum likelihood
- mixture models