Implications of reference priors for prior information and for sample size

Bertrand S Clarke

Research output: Contribution to journal › Article

24 Citations (Scopus)

Abstract

Here we use posterior densities based on relative entropy reference priors for two purposes. The first purpose is to identify data implicit in the use of informative priors. We represent an informative prior as the posterior from an experiment with a known likelihood and a reference prior. Minimizing the relative entropy distance between this posterior and the informative prior over choices of data results in a data set that can be regarded as representative of the information in the informative prior. The second implication from reference priors is obtained by replacing the informative prior with a class of densities from which one might wish to make inferences. For each density in this class, one can obtain a data set that minimizes a relative entropy; the size of that data set serves as a candidate sample size. The maximum of these sample sizes as the inferential density varies over its class can be used as a guess at how much data is required for the desired inferences. We bound this sample size above and below by other techniques that permit it to be approximated.
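To make the two constructions concrete, the sketch below specializes to the conjugate Bernoulli/Beta setting, assuming a Bernoulli likelihood with the uniform Beta(1, 1) in the role of the reference prior. The paper's construction is more general; the discrete grid search, the direction of the relative entropy, and the function names (kl_beta, implied_data, sample_size_guess) are illustrative choices for this sketch, not the paper's method.

```python
from scipy.special import betaln, digamma

def kl_beta(a1, b1, a2, b2):
    """Relative entropy D( Beta(a1, b1) || Beta(a2, b2) ),
    via the standard closed form for Beta densities."""
    return (betaln(a2, b2) - betaln(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

def implied_data(a, b, n_max=100):
    """Purpose 1: find the Bernoulli data set (k successes in n trials)
    whose reference posterior Beta(1 + k, 1 + n - k) -- starting from the
    uniform Beta(1, 1) reference prior -- is closest in relative entropy
    to the informative prior Beta(a, b)."""
    best = (float("inf"), 0, 0)
    for n in range(n_max + 1):
        for k in range(n + 1):
            d = kl_beta(1 + k, 1 + n - k, a, b)  # D(posterior || informative)
            if d < best[0]:
                best = (d, n, k)
    return best  # (divergence, n, k)

def sample_size_guess(target_class, n_max=100):
    """Purpose 2: the largest implied sample size as the target density
    ranges over a class of Beta densities -- a rough guess at how much
    data the desired inferences require."""
    return max(implied_data(a, b, n_max)[1] for a, b in target_class)

# A Beta(4, 7) informative prior is exactly the uniform-prior posterior
# after 3 successes and 6 failures, so the implied data set has size 9.
d, n, k = implied_data(4, 7)
print(f"implied data: {k} successes in {n} trials (KL = {d:.3g})")

# Over a small class of candidate target densities:
print("sample size guess:", sample_size_guess([(4, 7), (10, 10), (2, 20)]))
```

In this conjugate case the minimizer recovers the familiar reading of a Beta(a, b) prior as a - 1 prior successes and b - 1 prior failures, and the brute-force search stands in for the asymptotic upper and lower bounds the paper uses to approximate the sample size.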

Original language: English (US)
Pages (from-to): 173-184
Number of pages: 12
Journal: Journal of the American Statistical Association
Volume: 91
Issue number: 433
DOIs: 10.1080/01621459.1996.10476674
State: Published - Mar 1 1996

Keywords

  • Asymptotic normality
  • Experimental design
  • Information
  • Relative entropy

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Implications of reference priors for prior information and for sample size. / Clarke, Bertrand S.

In: Journal of the American Statistical Association, Vol. 91, No. 433, 01.03.1996, p. 173-184.

@article{cacf74f71f97480d8b47fe21c1b8adbf,
title = "Implications of reference priors for prior information and for sample size",
abstract = "Here we use posterior densities based on relative entropy reference priors for two purposes. The first purpose is to identify data implicit in the use of informative priors. We represent an informative prior as the posterior from an experiment with a known likelihood and a reference prior. Minimizing the relative entropy distance between this posterior and the informative prior over choices of data results in a data set that can be regarded as representative of the information in the informative prior. The second implication from reference priors is obtained by replacing the informative prior with a class of densities from which one might wish to make inferences. For each density in this class, one can obtain a data set that minimizes a relative entropy. The maximum of these sample sizes as the inferential density varies over its class can be used as a guess as to how much data is required for the desired inferences. We bound this sample size above and below by other techniques that permit it to be approximated.",
keywords = "Asymptotic normality, Experimental design, Information, Relative entropy",
author = "Clarke, {Bertrand S}",
year = "1996",
month = "3",
day = "1",
doi = "10.1080/01621459.1996.10476674",
language = "English (US)",
volume = "91",
pages = "173--184",
journal = "Journal of the American Statistical Association",
issn = "0162-1459",
publisher = "Taylor and Francis Ltd.",
number = "433",

}
