About the Author
Andrew Gelman is a professor in the Department of Statistics at Columbia University and director of the Applied Statistics Center. He has received the Outstanding Statistical Application Award from the American Statistical Association, the award for best article published in the American Political Science Review, and the Council of Presidents of Statistical Societies award for outstanding contributions by a person under the age of 40. His books include Bayesian Data Analysis (with John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin) and Teaching Statistics, among others.
Contents
Preface
Part I: Fundamentals of Bayesian Inference
1 Probability and inference
1.1 The three steps of Bayesian data analysis
1.2 General notation for statistical inference
1.3 Bayesian inference
1.4 Discrete examples: genetics and spell checking
1.5 Probability as a measure of uncertainty
1.6 Example: probabilities from football point spreads
1.7 Example: calibration for record linkage
1.8 Some useful results from probability theory
1.9 Computation and software
1.10 Bayesian inference in applied statistics
1.11 Bibliographic note
1.12 Exercises
2 Single-parameter models
2.1 Estimating a probability from binomial data
2.2 Posterior as compromise between data and prior information
2.3 Summarizing posterior inference
2.4 Informative prior distributions
2.5 Normal distribution with known variance
2.6 Other standard single-parameter models
2.7 Example: informative prior distribution for cancer rates
2.8 Noninformative prior distributions
2.9 Weakly informative prior distributions
2.10 Bibliographic note
2.11 Exercises
3 Introduction to multiparameter models
3.1 Averaging over nuisance parameters
3.2 Normal data with a noninformative prior distribution
3.3 Normal data with a conjugate prior distribution
3.4 Multinomial model for categorical data
3.5 Multivariate normal model with known variance
3.6 Multivariate normal with unknown mean and variance
3.7 Example: analysis of a bioassay experiment
3.8 Summary of elementary modeling and computation
3.9 Bibliographic note
3.10 Exercises
4 Asymptotics and connections to non-Bayesian approaches
4.1 Normal approximations to the posterior distribution
4.2 Large-sample theory
4.3 Counterexamples to the theorems
4.4 Frequency evaluations of Bayesian inferences
4.5 Bayesian interpretations of other statistical methods
4.6 Bibliographic note
4.7 Exercises
5 Hierarchical models
5.1 Constructing a parameterized prior distribution
5.2 Exchangeability and hierarchical models
5.3 Bayesian analysis of conjugate hierarchical models
5.4 Normal model with exchangeable parameters
5.5 Example: parallel experiments in eight schools
5.6 Hierarchical modeling applied to a meta-analysis
5.7 Weakly informative priors for variance parameters
5.8 Bibliographic note
5.9 Exercises
Part II: Fundamentals of Bayesian Data Analysis
6 Model checking
6.1 The place of model checking in applied Bayesian statistics
6.2 Do the inferences from the model make sense?
6.3 Posterior predictive checking
6.4 Graphical posterior predictive checks
6.5 Model checking for the educational testing example
6.6 Bibliographic note
6.7 Exercises
7 Evaluating, comparing, and expanding models
7.1 Measures of predictive accuracy
7.2 Information criteria and cross-validation
7.3 Model comparison based on predictive performance
7.4 Model comparison using Bayes factors
7.5 Continuous model expansion
7.6 Implicit assumptions and model expansion: an example
7.7 Bibliographic note
7.8 Exercises
8 Modeling accounting for data collection
8.1 Bayesian inference requires a model for data collection
8.2 Data-collection models and ignorability
8.3 Sample surveys
8.4 Designed experiments
8.5 Sensitivity and the role of randomization
8.6 Observational studies
8.7 Censoring and truncation
8.8 Discussion
8.9 Bibliographic note
8.10 Exercises
9 Decision analysis
9.1 Bayesian decision theory in different contexts
9.2 Using regression predictions: survey incentives
9.3 Multistage decision making: medical screening
9.4 Hierarchical decision analysis for home radon
9.5 Personal vs. institutional decision analysis
9.6 Bibliographic note
9.7 Exercises
Part III: Advanced Computation
10 Introduction to Bayesian computation
10.1 Numerical integration
10.2 Distributional approximations
10.3 Direct simulation and rejection sampling
10.4 Importance sampling
10.5 How many simulation draws are needed?
10.6 Computing environments
10.7 Debugging Bayesian computing
10.8 Bibliographic note
10.9 Exercises
11 Basics of Markov chain simulation
11.1 Gibbs sampler
11.2 Metropolis and Metropolis-Hastings algorithms
11.3 Using Gibbs and Metropolis as building blocks