### Abstract

We begin by recalling the division of statistical problems into three classes, M-closed, M-complete, and M-open, and then review the key ideas of introductory Shannon theory. Focusing on the related but distinct goals of model selection and prediction, we argue that different techniques for these two goals are appropriate for the three problem classes. For M-closed problems, we give a relative entropy justification that the Bayes information criterion (BIC) is appropriate for model selection and that the Bayes model average is information-optimal for prediction. For M-complete problems, we discuss the principle of maximum entropy and a way to use the rate distortion function to bypass the inaccessibility of the true distribution. For prediction in the M-complete class, little work has been done on information-based model averaging, so we discuss the Akaike information criterion (AIC) and its properties and variants. For the M-open class, we argue that essentially only predictive criteria are suitable. Thus, as an analog to model selection, we present the key ideas of prediction along a string under a codelength criterion and propose a general form of this criterion. Since little work appears to have been done on information methods for general prediction in the M-open class of problems, we mention the field of information theoretic learning in certain general function spaces.
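The abstract contrasts the Bayes information criterion (BIC) and the Akaike information criterion (AIC) as selection tools for different problem classes. As a minimal sketch of how the two criteria are computed from a maximized log-likelihood (the Gaussian data and the two candidate models below are hypothetical, chosen only to make the formulas concrete; this example is not taken from the article), one might write:

```python
import math
import random

random.seed(0)

# Hypothetical data: n draws from N(0.5, 1) with known unit variance.
# Since the data-generating model is among the candidates, this toy
# problem is M-closed in the abstract's terminology.
n = 200
data = [random.gauss(0.5, 1.0) for _ in range(n)]

def gaussian_loglik(xs, mu, sigma=1.0):
    """Log-likelihood of xs under N(mu, sigma^2)."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu) ** 2 / (2 * sigma**2) for x in xs)

# Candidate 1: mean fixed at 0 (k = 0 fitted parameters).
# Candidate 2: fitted mean (k = 1; the MLE is the sample mean).
mu_hat = sum(data) / n
candidates = {
    "mu = 0 (k=0)": (gaussian_loglik(data, 0.0), 0),
    "mu = MLE (k=1)": (gaussian_loglik(data, mu_hat), 1),
}

for name, (loglik, k) in candidates.items():
    bic = k * math.log(n) - 2 * loglik   # smaller is better
    aic = 2 * k - 2 * loglik             # smaller is better
    print(f"{name}: BIC = {bic:.1f}, AIC = {aic:.1f}")
```

Both criteria penalize the maximized log-likelihood by the number of fitted parameters k; BIC's penalty grows like ln(n) with sample size, which is one reason it is typically associated with consistent selection in the M-closed setting the abstract describes, while AIC's fixed penalty of 2k is oriented toward predictive performance.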

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 337-371 |
| Number of pages | 35 |
| Journal | Econometric Reviews |
| Volume | 33 |
| Issue number | 1-4 |
| DOIs | https://doi.org/10.1080/07474938.2013.807190 |
| ISSN | 0747-4938 |
| State | Published - Feb 1 2014 |


### Keywords

- Bayesian
- Codelength
- Entropy
- Information theory
- M-closed
- M-complete
- M-open
- Model selection
- Mutual information
- Prediction
- Rate distortion
- Relative entropy

### ASJC Scopus subject areas

- Economics and Econometrics

### Cite this

Clarke, Bertrand; Clarke, Jennifer; Yu, Chi Wai (2014). **Statistical Problem Classes and Their Links to Information Theory.** *Econometric Reviews*, *33*(1-4), 337-371. https://doi.org/10.1080/07474938.2013.807190

Research output: Contribution to journal › Article
