Linear models in statistics
Publisher
Wiley-Interscience
Publication Date
c2008
Language
English
Table of Contents
From the Book - 2nd ed.
Preface
1. Introduction
1.1. Simple Linear Regression Model
1.2. Multiple Linear Regression Model
1.3. Analysis-of-Variance Models
2. Matrix Algebra
2.1. Matrix and Vector Notation
2.1.1. Matrices, Vectors, and Scalars
2.1.2. Matrix Equality
2.1.3. Transpose
2.1.4. Matrices of Special Form
2.2. Operations
2.2.1. Sum of Two Matrices or Two Vectors
2.2.2. Product of a Scalar and a Matrix
2.2.3. Product of Two Matrices or Two Vectors
2.2.4. Hadamard Product of Two Matrices or Two Vectors
2.3. Partitioned Matrices
2.4. Rank
2.5. Inverse
2.6. Positive Definite Matrices
2.7. Systems of Equations
2.8. Generalized Inverse
2.8.1. Definition and Properties
2.8.2. Generalized Inverses and Systems of Equations
2.9. Determinants
2.10. Orthogonal Vectors and Matrices
2.11. Trace
2.12. Eigenvalues and Eigenvectors
2.12.1. Definition
2.12.2. Functions of a Matrix
2.12.3. Products
2.12.4. Symmetric Matrices
2.12.5. Positive Definite and Semidefinite Matrices
2.13. Idempotent Matrices
2.14. Vector and Matrix Calculus
2.14.1. Derivatives of Functions of Vectors and Matrices
2.14.2. Derivatives Involving Inverse Matrices and Determinants
2.14.3. Maximization or Minimization of a Function of a Vector
3. Random Vectors and Matrices
3.1. Introduction
3.2. Means, Variances, Covariances, and Correlations
3.3. Mean Vectors and Covariance Matrices for Random Vectors
3.3.1. Mean Vectors
3.3.2. Covariance Matrix
3.3.3. Generalized Variance
3.3.4. Standardized Distance
3.4. Correlation Matrices
3.5. Mean Vectors and Covariance Matrices for Partitioned Random Vectors
3.6. Linear Functions of Random Vectors
3.6.1. Means
3.6.2. Variances and Covariances
4. Multivariate Normal Distribution
4.1. Univariate Normal Density Function
4.2. Multivariate Normal Density Function
4.3. Moment Generating Functions
4.4. Properties of the Multivariate Normal Distribution
4.5. Partial Correlation
5. Distribution of Quadratic Forms in y
5.1. Sums of Squares
5.2. Mean and Variance of Quadratic Forms
5.3. Noncentral Chi-Square Distribution
5.4. Noncentral F and t Distributions
5.4.1. Noncentral F Distribution
5.4.2. Noncentral t Distribution
5.5. Distribution of Quadratic Forms
5.6. Independence of Linear Forms and Quadratic Forms
6. Simple Linear Regression
6.1. The Model
6.2. Estimation of β₀, β₁, and σ²
6.3. Hypothesis Test and Confidence Interval for β₁
6.4. Coefficient of Determination
7. Multiple Regression: Estimation
7.1. Introduction
7.2. The Model
7.3. Estimation of β and σ²
7.3.1. Least-Squares Estimator for β
7.3.2. Properties of the Least-Squares Estimator β
7.3.3. An Estimator for σ²
7.4. Geometry of Least-Squares
7.4.1. Parameter Space, Data Space, and Prediction Space
7.4.2. Geometric Interpretation of the Multiple Linear Regression Model
7.5. The Model in Centered Form
7.6. Normal Model
7.6.1. Assumptions
7.6.2. Maximum Likelihood Estimators for β and σ²
7.6.3. Properties of β and σ²
7.7. R² in Fixed-x Regression
7.8. Generalized Least-Squares: cov(y) = σ²V
7.8.1. Estimation of β and σ² when cov(y) = σ²V
7.8.2. Misspecification of the Error Structure
7.9. Model Misspecification
7.10. Orthogonalization
8. Multiple Regression: Tests of Hypotheses and Confidence Intervals
8.1. Test of Overall Regression
8.2. Test on a Subset of the β Values
8.3. F Test in Terms of R²
8.4. The General Linear Hypothesis Tests for H₀: Cβ = 0 and H₀: Cβ = t
8.4.1. The Test for H₀: Cβ = 0
8.4.2. The Test for H₀: Cβ = t
8.5. Tests on βⱼ and a′β
8.5.1. Testing One βⱼ or One a′β
8.5.2. Testing Several βⱼ or a′ᵢβ Values
8.6. Confidence Intervals and Prediction Intervals
8.6.1. Confidence Region for β
8.6.2. Confidence Interval for βⱼ
8.6.3. Confidence Interval for a′β
8.6.4. Confidence Interval for E(y)
8.6.5. Prediction Interval for a Future Observation
8.6.6. Confidence Interval for σ²
8.6.7. Simultaneous Intervals
8.7. Likelihood Ratio Tests
9. Multiple Regression: Model Validation and Diagnostics
9.1. Residuals
9.2. The Hat Matrix
9.3. Outliers
9.4. Influential Observations and Leverage
10. Multiple Regression: Random x's
10.1. Multivariate Normal Regression Model
10.2. Estimation and Testing in Multivariate Normal Regression
10.3. Standardized Regression Coefficients
10.4. R² in Multivariate Normal Regression
10.5. Tests and Confidence Intervals for R²
10.6. Effect of Each Variable on R²
10.7. Prediction for Multivariate Normal or Nonnormal Data
10.8. Sample Partial Correlations
11. Multiple Regression: Bayesian Inference
11.1. Elements of Bayesian Statistical Inference
11.2. A Bayesian Multiple Linear Regression Model
11.2.1. A Bayesian Multiple Regression Model with a Conjugate Prior
11.2.2. Marginal Posterior Density of β
11.2.3. Marginal Posterior Densities of τ and σ²
11.3. Inference in Bayesian Multiple Linear Regression
11.3.1. Bayesian Point and Interval Estimates of Regression Coefficients
11.3.2. Hypothesis Tests for Regression Coefficients in Bayesian Inference
11.3.3. Special Cases of Inference in Bayesian Multiple Regression Models
11.3.4. Bayesian Point and Interval Estimation of σ²
11.4. Bayesian Inference through Markov Chain Monte Carlo Simulation
11.5. Posterior Predictive Inference
12. Analysis-of-Variance Models
12.1. Non-Full-Rank Models
12.1.1. One-Way Model
12.1.2. Two-Way Model
12.2. Estimation
12.2.1. Estimation of β
12.2.2. Estimable Functions of β
12.3. Estimators
12.3.1. Estimators of λ′β
12.3.2. Estimation of σ²
12.3.3. Normal Model
12.4. Geometry of Least-Squares in the Overparameterized Model
12.5. Reparameterization
12.6. Side Conditions
12.7. Testing Hypotheses
12.7.1. Testable Hypotheses
12.7.2. Full-Reduced-Model Approach
12.7.3. General Linear Hypothesis
12.8. An Illustration of Estimation and Testing
12.8.1. Estimable Functions
12.8.2. Testing a Hypothesis
12.8.3. Orthogonality of Columns of X
13. One-Way Analysis-of-Variance: Balanced Case
13.1. The One-Way Model
13.2. Estimable Functions
13.3. Estimation of Parameters
13.3.1. Solving the Normal Equations
13.3.2. An Estimator for [sigma superscript 2]
13.4. Testing the Hypothesis H₀: μ₁ = μ₂ = ... = μₖ
13.4.1. Full-Reduced-Model Approach
13.4.2. General Linear Hypothesis
13.5. Expected Mean Squares
13.5.1. Full-Reduced-Model Approach
13.5.2. General Linear Hypothesis
13.6. Contrasts
13.6.1. Hypothesis Test for a Contrast
13.6.2. Orthogonal Contrasts
13.6.3. Orthogonal Polynomial Contrasts
14. Two-Way Analysis-of-Variance: Balanced Case
14.1. The Two-Way Model
14.2. Estimable Functions
14.3. Estimators of λ′β and σ²
14.3.1. Solving the Normal Equations and Estimating λ′β
14.3.2. An Estimator for σ²
14.4. Testing Hypotheses
14.4.1. Test for Interaction
14.4.2. Tests for Main Effects
14.5. Expected Mean Squares
14.5.1. Sums-of-Squares Approach
14.5.2. Quadratic Form Approach
15. Analysis-of-Variance: The Cell Means Model for Unbalanced Data
15.1. Introduction
15.2. One-Way Model
15.2.1. Estimation and Testing
15.2.2. Contrasts
15.3. Two-Way Model
15.3.1. Unconstrained Model
15.3.2. Constrained Model
15.4. Two-Way Model with Empty Cells
16. Analysis-of-Covariance
16.1. Introduction
16.2. Estimation and Testing
16.2.1. The Analysis-of-Covariance Model
16.2.2. Estimation
16.2.3. Testing Hypotheses
16.3. One-Way Model with One Covariate
16.3.1. The Model
16.3.2. Estimation
16.3.3. Testing Hypotheses
16.4. Two-Way Model with One Covariate
16.4.1. Tests for Main Effects and Interactions
16.4.2. Test for Slope
16.4.3. Test for Homogeneity of Slopes
16.5. One-Way Model with Multiple Covariates
16.5.1. The Model
16.5.2. Estimation
16.5.3. Testing Hypotheses
16.6. Analysis-of-Covariance with Unbalanced Models
17. Linear Mixed Models
17.1. Introduction
17.2. The Linear Mixed Model
17.3. Examples
17.4. Estimation of Variance Components
17.5. Inference for β
17.5.1. An Estimator for β
17.5.2. Large-Sample Inference for Estimable Functions of β
17.5.3. Small-Sample Inference for Estimable Functions of β
17.6. Inference for the aᵢ Terms
17.7. Residual Diagnostics
18. Additional Models
18.1. Nonlinear Regression
18.2. Logistic Regression
18.3. Loglinear Models
18.4. Poisson Regression
18.5. Generalized Linear Models
Appendix A. Answers and Hints to the Problems
References
Index
More Details
ISBN
9780471754985