We show that the relative entropy between a posterior density, formed from a smooth likelihood and prior, and a limiting normal form tends to zero in the independent and identically distributed (i.i.d.) case. Convergence holds both in probability and in mean. Applications to codelengths in stochastic complexity and to sample-size selection are briefly discussed.
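The vanishing relative entropy described in the abstract can be illustrated numerically. The sketch below assumes a simple conjugate setting not taken from the paper: a Bernoulli likelihood with a uniform Beta(1, 1) prior, so the posterior is a Beta density, and the limiting normal form is taken as the Laplace (normal) approximation at the posterior mode. The function name `kl_beta_vs_laplace` and the grid-based integration are illustrative choices, not the paper's method.

```python
import numpy as np
from scipy import stats

def kl_beta_vs_laplace(n, s, grid=200001):
    # Posterior for a Bernoulli model with a uniform Beta(1, 1) prior:
    # s successes in n trials -> Beta(1 + s, 1 + n - s).
    a, b = 1 + s, 1 + n - s
    mode = (a - 1) / (a + b - 2)                     # posterior mode
    # Laplace approximation: variance = inverse curvature of the
    # log posterior at its mode.
    var = 1.0 / ((a - 1) / mode**2 + (b - 1) / (1 - mode)**2)
    theta = np.linspace(1e-6, 1 - 1e-6, grid)
    logp = stats.beta.logpdf(theta, a, b)
    logq = stats.norm.logpdf(theta, mode, np.sqrt(var))
    # Relative entropy KL(posterior || normal) by a Riemann sum on the grid.
    return float(np.sum(np.exp(logp) * (logp - logq)) * (theta[1] - theta[0]))

# The divergence shrinks as the sample size grows (fixed success rate 0.6).
for n in (10, 100, 1000):
    print(n, kl_beta_vs_laplace(n, s=int(0.6 * n)))
```

In this conjugate example the divergence decreases roughly like 1/n, consistent with the abstract's claim that the posterior approaches its normal limit as the i.i.d. sample grows.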
Keywords

- Asymptotic normality
- Posterior density
- Relative entropy
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences