We study the trade-off between efficiency and robustness of the estimators obtained by minimizing the divergence statistics, and of their adjoints, the estimators obtained by minimizing the asymmetric counterparts of the divergence statistics. In particular, it is shown that no minimum power-divergence estimator is better than the minimum Hellinger distance estimator in terms of both second-order efficiency and robustness.
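As a point of reference (and an assumption on our part, since the abstract does not define its notation), the power divergences here are presumably the Cressie–Read family between a density estimate \(d\) and the model density \(f_{\theta}\); written for the discrete case,

\[
I^{\lambda}(d, f_{\theta}) \;=\; \frac{1}{\lambda(\lambda+1)} \sum_{x} d(x)\left[\left(\frac{d(x)}{f_{\theta}(x)}\right)^{\lambda} - 1\right], \qquad \lambda \in \mathbb{R},
\]

with the member \(\lambda = -1/2\) recovering twice the squared Hellinger distance,

\[
I^{-1/2}(d, f_{\theta}) \;=\; 2\sum_{x}\left(\sqrt{d(x)} - \sqrt{f_{\theta}(x)}\right)^{2},
\]

so the minimum Hellinger distance estimator sits inside the power-divergence class as the \(\lambda = -1/2\) case.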