Chapter 1 is a whirlwind tour of ideas we will use throughout our continuing study of mathematical statistics. Skim this chapter and the problems. (Just *skim* -- read quickly, without necessarily working through all the details! Keep track of passages that you don't fully understand, but don't allow these passages to deter you from finishing the reading. Familiarity with the ideas is important. We will highlight important details in class or with homework, and we will return to some of the details when needed ...)

In Chapter 1 we must refresh our memory about random variables, distributions, parameters and models, identifiability, testing and estimation, order statistics and the empirical distribution function, Bayes' Theorem, priors, posteriors and conjugacy, loss functions, decision procedures and risk functions, admissibility, randomization, prediction, sufficient statistics, likelihood, and exponential families.

-----

Some recommended homework problems for Chapter 1:

1.1.1 -- elementary probability model examples
1.1.4(a) -- stochastic ordering
1.2.1 -- elementary Bayes rule example
1.2.3 -- Bayes rule: geometric & beta
1.2.6 -- conjugate families: Poisson & gamma (see the sketch after this list)
1.2.15 -- Dirichlet as conjugate prior for the multinomial
1.3.1 -- elementary decision theory example
1.3.11(a) -- unbiasedness for decision rules coincides with the definition for estimates
1.4.1 -- balls, urns, & predictors
1.4.20(b) -- MSPE of the optimal predictor of Y based on X
1.5.1 -- sufficiency in the Poisson model (directly, & via the factorization theorem)
1.5.8 -- sufficiency of the order statistics
1.5.12 -- minimal sufficiency & likelihood
1.6.1 -- natural sufficient statistics & parameters for common one-parameter EFs
1.6.5 -- two-parameter EFs: beta & gamma
1.6.6 -- Dirichlet as an exponential family
1.6.19 -- logistic regression
1.6.27 -- we may choose h in an EF
1.6.28(b)(ii) -- the Gaussian is the maximum-entropy distribution
1.6.31 -- normal mixtures: hierarchical Bayes
1.6.32 -- binomial-beta: hierarchical Bayes
1.6.34 -- stationary Markov chain as an exponential family
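
As a quick refresher on the conjugacy idea behind problems 1.2.3, 1.2.6, and 1.2.15, here is a minimal numerical sketch of the Poisson-gamma pair (assuming NumPy and SciPy are available; the prior parameters and sample size below are illustrative, not taken from the text). With prior lambda ~ Gamma(a, rate b) and X_1, ..., X_n iid Poisson(lambda), the posterior is Gamma(a + sum x_i, rate b + n), and the code checks this closed form against a brute-force grid computation:

```python
# Minimal sketch of Poisson-gamma conjugacy (cf. Problem 1.2.6).
# Prior lambda ~ Gamma(a, rate=b); posterior is Gamma(a + sum(x), rate=b + n).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = 2.0, 1.0                           # illustrative prior shape and rate
x = rng.poisson(lam=3.0, size=20)         # simulated Poisson sample

a_post, b_post = a + x.sum(), b + len(x)  # conjugate posterior update

# Brute-force check: prior * likelihood on a grid, normalized numerically,
# should match the closed-form Gamma(a_post, rate=b_post) density.
grid = np.linspace(0.01, 10.0, 2000)
log_unnorm = (stats.gamma.logpdf(grid, a, scale=1.0 / b)
              + stats.poisson.logpmf(x[:, None], grid).sum(axis=0))
unnorm = np.exp(log_unnorm - log_unnorm.max())   # rescale for stability
numeric = unnorm / (unnorm.sum() * (grid[1] - grid[0]))

closed_form = stats.gamma.pdf(grid, a_post, scale=1.0 / b_post)
print(np.abs(numeric - closed_form).max())       # should be near zero
```

The same pattern (multiply the prior density by the likelihood, recognize the result as a member of the prior's family with updated parameters) is what the Dirichlet-multinomial problem (1.2.15) asks you to carry out by hand.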