Review Articles

β-divergence loss for the kernel density estimation with bias reduced

Hamza Dhaker (a), El Hadji Deme (b) & Youssou Ciss (b)

(a) Mathématiques et statistique, Université de Moncton, Moncton, Canada
(b) UFR SAT, Université Gaston Berger, Saint-Louis, Senegal

Pages 221-231 | Received 26 Oct. 2019, Accepted 30 Nov. 2020, Published online: 14 Dec. 2020

ABSTRACT

In this paper, we investigate the problem of estimating a probability density function. Although bias-reduced kernel density estimation is nowadays a standard technique in exploratory data analysis, there is still considerable debate about how to assess the quality of the estimate and which choice of bandwidth is optimal. This work examines the most important bandwidth selection methods for kernel density estimation in the context of bias reduction. Normal reference, least squares cross-validation, biased cross-validation and β-divergence loss methods are described and their expressions are presented. In order to assess the performance of our various bandwidth selectors, numerical simulations are carried out and an application to environmental data is presented.
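As background for the abstract above, the sketch below illustrates two of the classical bandwidth selectors it names, the normal reference rule and least squares cross-validation, applied to the standard Gaussian kernel density estimator. It is a minimal, generic illustration only: the bias-reduced estimator and the β-divergence-based selector studied in the article are not reproduced here, and all function names are our own.

# Illustrative sketch: standard Gaussian KDE with two classical bandwidth
# selectors (normal reference and least squares cross-validation).
# Not the paper's bias-reduced estimator or beta-divergence selector.
import numpy as np

def kde(x_grid, data, h):
    """Gaussian kernel density estimate evaluated on x_grid."""
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def h_normal_reference(data):
    """Normal reference rule of thumb: h = 1.06 * sigma * n^(-1/5)."""
    return 1.06 * data.std(ddof=1) * len(data) ** (-1 / 5)

def lscv_score(data, h):
    """Least squares cross-validation criterion for a Gaussian kernel."""
    n = len(data)
    d = (data[:, None] - data[None, :]) / h
    # Integral of the squared estimate: uses the kernel convolved with itself,
    # which for a standard Gaussian kernel is the N(0, 2) density.
    term1 = np.exp(-0.25 * d**2).sum() / (n**2 * h * 2 * np.sqrt(np.pi))
    # Leave-one-out term: exclude the diagonal (i == j) contributions.
    K = np.exp(-0.5 * d**2) / np.sqrt(2 * np.pi)
    term2 = 2 * (K.sum() - n * K[0, 0]) / (n * (n - 1) * h)
    return term1 - term2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=200)
    h_nr = h_normal_reference(data)
    hs = np.linspace(0.1, 1.0, 50)
    h_lscv = hs[np.argmin([lscv_score(data, h) for h in hs])]
    print(f"normal reference h = {h_nr:.3f}, LSCV h = {h_lscv:.3f}")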

To cite this article: Hamza Dhaker, El Hadji Deme & Youssou Ciss (2021) β-divergence loss for the kernel density estimation with bias reduced, Statistical Theory and Related Fields, 5:3, 221-231, DOI: 10.1080/24754269.2020.1858630

To link to this article: https://doi.org/10.1080/24754269.2020.1858630