Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences (3 HAR/CDR)

E-book price: ¥13,480
  • E-book available

  • Out of stock. We will order this item from the publisher via an overseas book distributor.
    1. Delivery may be delayed, and the item may become unavailable.
    2. Orders for multiple copies may be shipped in separate parcels.
    3. Requests for copies in mint condition cannot be accommodated.

  • In stock at our partner overseas book distributor. Usually ships in about two weeks.
    1. In rare cases, delivery may be delayed or the item may become unavailable.
    2. Orders for multiple copies may be shipped in separate parcels.
    3. Requests for copies in mint condition cannot be accommodated.
  • Binding: Hardcover / 1200 p.
  • Language: English
  • Product code: 9780805822236
  • DDC classification: 519.536


Discusses such topics as alternative procedures, the potential problems posed by outliers, multilevel models for clustered data, and the use of longitudinal data to answer research questions.

Full Description

This classic text on multiple regression is noted for its nonmathematical, applied, and data-analytic approach. Readers profit from its verbal-conceptual exposition and frequent use of examples. The applied emphasis provides clear illustrations of the principles and worked examples of the types of applications that are possible. Researchers learn how to specify regression models that directly address their research questions. An overview of the fundamental ideas of multiple regression and a review of bivariate correlation and regression and other elementary statistical concepts provide a strong foundation for understanding the rest of the text. The third edition features an increased emphasis on graphics and on the use of confidence intervals and effect size measures, and an accompanying website with data for most of the numerical examples, along with computer code for SPSS, SAS, and SYSTAT. Applied Multiple Regression serves both as a textbook for graduate students and as a reference tool for researchers in psychology, education, health sciences, communications, business, sociology, political science, anthropology, and economics. An introductory knowledge of statistics is required. Self-standing chapters minimize the need for researchers to refer to previous chapters.
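The book's own materials supply code for SPSS, SAS, and SYSTAT. Purely as an illustrative sketch (not taken from the book), the core computation it teaches, an ordinary least squares fit with two independent variables, can be solved from the normal equations X'Xb = X'y; the data below are hypothetical.

```python
def ols(X, y):
    """Return OLS coefficients for rows X (each with a leading 1 for the intercept)."""
    k = len(X[0])
    # Build the normal equations X'X b = X'y.
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting, then back substitution.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * c for a, c in zip(A[r], A[col])]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Hypothetical, noise-free data generated by Y = 1 + 2*X1 + 0.5*X2.
X = [[1, x1, x2] for x1, x2 in [(0, 0), (1, 2), (2, 1), (3, 4), (4, 3), (5, 5)]]
y = [1 + 2 * x1 + 0.5 * x2 for _, x1, x2 in X]
b0, b1, b2 = ols(X, y)
print(round(b0, 3), round(b1, 3), round(b2, 3))  # recovers the generating coefficients 1, 2, 0.5
```

In practice one would use a statistical package rather than hand-rolled elimination; the point is only that the regression coefficients the text interprets at length are the solution of this small linear system.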

Table of Contents

Preface xxv
Introduction 1 (18)
Multiple Regression/Correlation as a 1 (3)
General Data-Analytic System
Overview 1 (1)
Testing Hypotheses Using Multiple 2 (1)
Regression/Correlation: Some Examples
Multiple Regression/Correlation in 3 (1)
Prediction Models
A Comparison of Multiple 4 (2)
Regression/Correlation and Analysis of
Variance Approaches
Historical Background 4 (1)
Hypothesis Testing and Effect Sizes 5 (1)
Multiple Regression/Correlation and the 6 (4)
Complexity of Behavioral Science
Multiplicity of Influences 6 (1)
Correlation Among Research Factors and 6 (1)
Form of Information 7 (1)
Shape of Relationship 8 (1)
General and Conditional Relationships 9 (1)
Orientation of the Book 10 (4)
Nonmathematical 11 (1)
Applied 11 (1)
Data-Analytic 12 (1)
Inference Orientation and Specification 13 (1)
Computation, the Computer, and Numerical 14 (2)
Computation 14 (1)
Numerical Results: Reporting and 14 (1)
Significance Tests, Confidence 15 (1)
Intervals, and Appendix Tables
The Spectrum of Behavioral Science 16 (1)
Plan for the Book 16 (2)
Content 16 (1)
Structure: Numbering of Sections, 17 (1)
Tables, and Equations
Summary 18 (1)
Bivariate Correlation and Regression 19 (45)
Tabular and Graphic Representations of 19 (4)
The Index of Linear Correlation Between 23 (5)
Two Variables: The Pearson Product Moment
Correlation Coefficient
Standard Scores: Making Units Comparable 23 (3)
The Product Moment Correlation as a 26 (2)
Function of Differences Between z Scores
Alternative Formulas for the Product 28 (4)
Moment Correlation Coefficient
r as the Average Product of z Scores 28 (1)
Raw Score Formulas for r 29 (1)
Point Biserial r 29 (1)
Phi (φ) Coefficient 30 (1)
Rank Correlation 31 (1)
Regression Coefficients: Estimating Y 32 (4)
From X
Regression Toward the Mean 36 (1)
The Standard Error of Estimate and 37 (4)
Measures of the Strength of Association
Summary of Definitions and Interpretations 41 (1)
Statistical Inference With Regression and 41 (9)
Correlation Coefficients
Assumptions Underlying Statistical 41 (1)
Inference With BYX, B0, Yi, and rXY
Estimation With Confidence Intervals 42 (5)
Null Hypothesis Significance Tests 47 (3)
Confidence Limits and Null Hypothesis 50 (1)
Significance Testing
Precision and Power 50 (3)
Precision of Estimation 50 (1)
Power of Null Hypothesis Significance 51 (2)
Factors Affecting the Size of r 53 (9)
The Distributions of X and Y 53 (2)
The Reliability of the Variables 55 (2)
Restriction of Range 57 (2)
Part-Whole Correlations 59 (1)
Ratio or Index Variables 60 (2)
Curvilinear Relationships 62 (1)
Summary 62 (2)
Multiple Regression/Correlation With Two or 64 (37)
More Independent Variables
Introduction: Regression and Causal Models 64 (2)
What Is a Cause? 64 (1)
Diagrammatic Representation of Causal 65 (1)
Regression With Two Independent Variables 66 (3)
Measures of Association With Two 69 (6)
Independent Variables
Multiple R and R2 69 (3)
Semipartial Correlation Coefficients 72 (2)
and Increments to R2
Partial Correlation Coefficients 74 (1)
Patterns of Association Between Y and Two 75 (4)
Independent Variables
Direct and Indirect Effects 75 (1)
Partial Redundancy 76 (1)
Suppression in Regression Models 77 (1)
Spurious Effects and Entirely Indirect 78 (1)
Multiple Regression/Correlation With k 79 (7)
Independent Variables
Introduction: Components of the 79 (1)
Prediction Equation
Partial Regression Coefficients 80 (2)
R, R2, and Shrunken R2 82 (2)
sr and sr2 84 (1)
pr and pr2 85 (1)
Example of Interpretation of Partial 85 (1)
Statistical Inference With k Independent 86 (4)
Standard Errors and Confidence 86 (2)
Intervals for B and β
Confidence Intervals for R2 88 (1)
Confidence Intervals for Differences 88 (1)
Between Independent R2s
Statistical Tests on Multiple and 88 (2)
Partial Coefficients
Statistical Precision and Power Analysis 90 (5)
Introduction: Research Goals and the 90 (1)
Null Hypothesis
The Precision and Power of R2 91 (2)
Precision and Power Analysis for 93 (2)
Partial Coefficients
Using Multiple Regression Equations in 95 (4)
Prediction of Y for a New Observation 95 (1)
Correlation of Individual Variables 96 (1)
With Predicted Values
Cross-Validation and Unit Weighting 97 (1)
Multicollinearity 98 (1)
Summary 99 (2)
Data Visualization, Exploration, and 101(50)
Assumption Checking: Diagnosing and Solving
Regression Problems I
Introduction 101(1)
Some Useful Graphical Displays of the 102(15)
Original Data
Univariate Displays 103(7)
Bivariate Displays 110(5)
Correlation and Scatterplot Matrices 115(2)
Assumptions and Ordinary Least Squares 117(8)
Assumptions Underlying Multiple Linear 117(7)
Ordinary Least Squares Estimation 124(1)
Detecting Violations of Assumptions 125(16)
Form of the Relationship 125(2)
Omitted Independent Variables 127(2)
Measurement Error 129(1)
Homoscedasticity of Residuals 130(4)
Nonindependence of Residuals 134(3)
Normality of Residuals 137(4)
Remedies: Alternative Approaches When 141(9)
Problems Are Detected
Form of the Relationship 141(2)
Inclusion of All Relevant Independent 143(1)
Measurement Error in the Independent 144(1)
Nonconstant Variance 145(2)
Nonindependence of Residuals 147(3)
Summary 150(1)
Data-Analytic Strategies Using Multiple 151(42)
Research Questions Answered by 151(3)
Correlations and Their Squares
Net Contribution to Prediction 152(1)
Indices of Differential Validity 152(1)
Comparisons of Predictive Utility 152(1)
Attribution of a Fraction of the XY 153(1)
Relationship to a Third Variable
Which of Two Variables Accounts for 153(1)
More of the XY Relationship?
Are the Various Squared Correlations in 154(1)
One Population Different From Those in
Another Given the Same Variables?
Research Questions Answered by B or β 154(4)
Regression Coefficients as Reflections 154(1)
of Causal Effects
Alternative Approaches to Making BYX 154(3)
Substantively Meaningful
Are the Effects of a Set of Independent 157(1)
Variables on Two Different Outcomes in
a Sample Different?
What Are the Reciprocal Effects of Two 157(1)
Variables on One Another?
Hierarchical Analysis Variables in 158(4)
Multiple Regression/Correlation
Causal Priority and the Removal of 158(2)
Confounding Variables
Research Relevance 160(1)
Examination of Alternative Hierarchical 160(1)
Sequences of Independent Variables Sets
Stepwise Regression 161(1)
The Analysis of Sets of Independent 162(9)
Types of Sets 162(2)
The Simultaneous and Hierarchical 164(2)
Analyses of Sets
Variance Proportions for Sets and the 166(3)
Ballantine Again
B and β Coefficients for Variables 169(2)
Within Sets
Significance Testing for Sets 171(5)
Application in Hierarchical Analysis 172(1)
Application in Simultaneous Analysis 173(1)
Using Computer Output to Determine 174(1)
Statistical Significance
An Alternative F Test: Using Model 2 174(2)
Error Estimate From the Final Model
Power Analysis for Sets 176(6)
Determining n* for the F Test of sR2B 177(2)
with Model 1 or Model 2 Error
Estimating the Population sR2 Values 179(1)
Setting Power for n* 180(1)
Reconciling Different n*s 180(1)
Power as a Function of n 181(1)
Tactics of Power Analysis 182(1)
Statistical Inference Strategy in 182(8)
Multiple Regression/Correlation
Controlling and Balancing Type I and 182(3)
Type II Errors in Inference
Less Is More 185(1)
Least Is Last 186(1)
Adaptation of Fisher's Protected t Test 187(3)
Statistical Inference and the Stage of 190(1)
Scientific Investigations
Summary 190(3)
Quantitative Scales, Curvilinear 193(62)
Relationships, and Transformations
Introduction 193(3)
What Do We Mean by Linear Regression? 193(1)
Linearity in the Variables and Linear 194(1)
Multiple Regression
Four Approaches to Examining Nonlinear 195(1)
Relationships in Multiple Regression
Power Polynomials 196(18)
Method 196(2)
An Example: Quadratic Fit 198(3)
Centering Predictors in Polynomial 201(3)
Relationship of Test of Significance of 204(1)
Highest Order Coefficient and Gain in
Interpreting Polynomial Regression 205(2)
Another Example: A Cubic Fit 207(2)
Strategy and Limitations 209(4)
More Complex Equations 213(1)
Orthogonal Polynomials 214(7)
The Cubic Example Revisited 216(3)
Unequal n and Unequal Intervals 219(1)
Applications and Discussion 220(1)
Nonlinear Transformations 221(30)
Purposes of Transformation and the 221(2)
Nature of Transformations
The Conceptual Basis of Transformations 223(1)
and Model Checking Before and After
Transformation---Is It Always Ideal to
Logarithms and Exponents; Additive and 223(2)
Proportional Relationships
Linearizing Relationships 225(2)
Linearizing Relationships Based on 227(5)
Strong Theoretical Models
Linearizing Relationships Based on Weak 232(1)
Theoretical Models
Empirically Driven Transformations in 233(1)
the Absence of Strong or Weak Models
Empirically Driven Transformation for 233(3)
Linearization: The Ladder of
Re-expression and the Bulging Rule
Empirically Driven Transformation for 236(3)
Linearization in the Absence of Models:
Box-Cox Family of Power Transformations
on Y
Empirically Driven Transformation for 239(1)
Linearization in the Absence of Models:
Box-Tidwell Family of Power
Transformations on X
Linearization of Relationships With 240(1)
Correlations: Fisher z Transform of r
Transformations That Linearize 240(4)
Relationships for Counts and Proportions
Variance Stabilizing Transformations 244(2)
and Alternatives for Treatment of
Transformations to Normalize Variables 246(1)
Diagnostics Following Transformation 247(1)
Measuring and Comparing Model Fit 248(1)
Second-Order Polynomial Numerical 248(1)
Example Revisited
When to Transform and the Choice of 249(2)
Nonlinear Regression 251(1)
Nonparametric Regression 252(1)
Summary 253(2)
Interactions Among Continuous Variables 255(47)
Introduction 255(6)
Interactions Versus Additive Effects 256(3)
Conditional First-Order Effects in 259(2)
Equations Containing Interactions
Centering Predictors and the 261(6)
Interpretation of Regression Coefficients
in Equations Containing Interactions
Regression with Centered Predictors 261(1)
Relationship Between Regression 262(1)
Coefficients in the Uncentered and
Centered Equations
Centered Equations With No Interaction 262(2)
Essential Versus Nonessential 264(1)
Centered Equations With Interactions 264(2)
The Highest Order Interaction in the 266(1)
Centered Versus Uncentered Equation
Do Not Center Y 266(1)
A Recommendation for Centering 266(1)
Simple Regression Equations and Simple 267(5)
Plotting Interactions 269(1)
Moderator Variables 269(1)
Simple Regression Equations 269(1)
Overall Regression Coefficient and 270(1)
Simple Slope at the Mean
Simple Slopes From Uncentered Versus 271(1)
Centered Equations Are Identical
Linear by Linear Interactions 271(1)
Interpreting Interactions in Multiple 272(1)
Regression and Analysis of Variance
Post Hoc Probing of Interactions 272(10)
Standard Error of Simple Slopes 272(1)
Equation Dependence of Simple Slopes 273(1)
and Their Standard Errors
Tests of Significance of Simple Slopes 273(1)
Confidence Intervals Around Simple 274(1)
A Numerical Example 275(6)
The Uncentered Regression Equation 281(1)
First-Order Coefficients in Equations 281(1)
Without and With Interactions
Interpretation and the Range of Data 282(1)
Standardized Estimates for Equations 282(2)
Containing Interactions
Interactions as Partialed Effects: 284(1)
Building Regression Equations With
Patterns of First-Order and Interactive 285(5)
Three Theoretically Meaningful Patterns 285(1)
of First-Order and Interaction Effects
Ordinal Versus Disordinal Interactions 286(4)
Three-Predictor Interactions in Multiple 290(2)
Curvilinear by Linear Interactions 292(3)
Interactions Among Sets of Variables 295(2)
Issues in the Detection of Interactions: 297(3)
Reliability, Predictor Distributions,
Model Specification
Variable Reliability and Power to 297(1)
Detect Interactions
Sampling Designs to Enhance Power to 298(1)
Detect Interactions---Optimal Design
Difficulty in Distinguishing 299(1)
Interactions Versus Curvilinear Effects
Summary 300(2)
Categorical or Nominal Independent Variables 302(52)
Introduction 302(1)
Categories as a Set of Independent 302(1)
The Representation of Categories or 302(1)
Nominal Scales
Dummy-Variable Coding 303(17)
Coding the Groups 303(5)
Pearson Correlations of Dummy Variables 308(3)
With Y
Correlations Among Dummy-Coded Variables 311(1)
Multiple Correlation of the 311(1)
Dummy-Variable Set With Y
Regression Coefficients for Dummy 312(4)
Partial and Semipartial Correlations 316(1)
for Dummy Variables
Dummy-Variable Multiple 317(2)
Regression/Correlation and One-Way
Analysis of Variance
A Cautionary Note: Dummy-Variable-Like 319(1)
Coding Systems
Dummy-Variable Coding When Groups Are 320(1)
Not Mutually Exclusive
Unweighted Effects Coding 320(8)
Introduction: Unweighted and Weighted 320(1)
Effects Coding
Constructing Unweighted Effects Codes 321(3)
The R2 and the rYi's for Unweighted 324(1)
Effects Codes
Regression Coefficients and Other 325(3)
Partial Effects in Unweighted Code Sets
Weighted Effects Coding 328(4)
Selection Considerations for Weighted 328(1)
Effects Coding
Constructing Weighted Effects 328(2)
The R2 and the rYi's for Weighted Effects Codes 330(1)
Interpretation and Testing of B With 331(1)
Unweighted Codes
Contrast Coding 332(9)
Considerations in the Selection of a 332(1)
Contrast Coding Scheme
Constructing Contrast Codes 333(4)
The R2 and the rYi's 337(1)
Partial Regression Coefficients 337(3)
Statistical Power and the Choice of 340(1)
Contrast Codes
Nonsense Coding 341(1)
Coding Schemes in the Context of Other 342(9)
Independent Variables
Combining Nominal and Continuous 342(1)
Independent Variables
Calculating Adjusted Means for Nominal 343(1)
Independent Variables
Adjusted Means for Combinations of 344(4)
Nominal and Quantitative Independent
Adjusted Means for More Than Two Groups 348(2)
and Alternative Coding Methods
Multiple Regression/Correlation With 350(1)
Nominal Independent Variables and the
Analysis of Covariance
Summary 351(3)
Interactions With Categorical Variables 354(36)
Nominal Scale by Nominal Scale 354(12)
The 2 by 2 Design 354(7)
Regression Analyses of Multiple Sets of 361(5)
Nominal Variables With More Than Two
Interactions Involving More Than Two 366(9)
Nominal Scales
An Example of Three Nominal Scales 367(5)
Coded by Alternative Methods
Interactions Among Nominal Scales in 372(1)
Which Not All Combinations Are
What If the Categories for One or More 373(1)
Nominal "Scales" Are Not Mutually
Consideration of pr, β, and 374(1)
Variance Proportions for Nominal Scale
Interaction Variables
Summary of Issues and Recommendations 374(1)
for Interactions Among Nominal Scales
Nominal Scale by Continuous Variable 375(13)
A Reminder on Centering 375(1)
Interactions of a Continuous Variable 375(3)
With Dummy-Variable Coded Groups
Interactions Using Weighted or 378(1)
Unweighted Effects Codes
Interactions With a Contrast-Coded 379(1)
Nominal Scale
Interactions Coded to Estimate Simple 380(3)
Slopes of Groups
Categorical Variable Interactions With 383(3)
Nonlinear Effects of Scaled Independent
Interactions of a Scale With Two or 386(2)
More Categorical Variables
Summary 388(2)
Outliers and Multicollinearity: Diagnosing 390(41)
and Solving Regression Problems II
Introduction 390(1)
Outliers: Introduction and Illustration 391(3)
Detecting Outliers: Regression Diagnostics 394(17)
Extremity on the Independent Variables: 394(4)
Extremity on Y: Discrepancy 398(4)
Influence on the Regression Estimates 402(4)
Location of Outlying Points and 406(3)
Diagnostic Statistics
Summary and Suggestions 409(2)
Sources of Outliers and Possible Remedial 411(8)
Sources of Outliers 411(4)
Remedial Actions 415(4)
Multicollinearity 419(6)
Exact Collinearity 419(1)
Multicollinearity: A Numerical 420(2)
Measures of the Degree of 422(3)
Remedies for Multicollinearity 425(5)
Model Respecification 426(1)
Collection of Additional Data 427(1)
Ridge Regression 427(1)
Principal Components Regression 428(1)
Summary of Multicollinearity 429(1)
Summary 430(1)
Missing Data 431(21)
Basic Issues in Handling Missing Data 431(4)
Minimize Missing Data 431(1)
Types of Missing Data 432(1)
Traditional Approaches to Missing Data 433(2)
Missing Data in Nominal Scales 435(7)
Coding Nominal Scale X for Missing Data 435(4)
Missing Data on Two Dichotomies 439(1)
Estimation Using the EM Algorithm 440(2)
Missing Data in Quantitative Scales 442(8)
Available Alternatives 442(2)
Imputation of Values for Missing Cases 444(3)
Modeling Solutions to Missing Data in 447(1)
Scaled Variables
An Illustrative Comparison of 447(3)
Alternative Methods
Rules of Thumb 450(1)
Summary 450(2)
Multiple Regression/Correlation and Causal 452(27)
Introduction 452(8)
Limits on the Current Discussion and 452(2)
the Relationship Between Causal
Analysis and Analysis of Covariance
Theories and Multiple 454(3)
Regression/Correlation Models That
Estimate and Test Them
Kinds of Variables in Causal Models 457(2)
Regression Models as Causal Models 459(1)
Models Without Reciprocal Causation 460(7)
Direct and Indirect Effects 460(4)
Path Analysis and Path Coefficients 464(1)
Hierarchical Analysis and Reduced Form 465(1)
Partial Causal Models and the 466(1)
Hierarchical Analysis of Sets
Testing Model Elements 467(1)
Models With Reciprocal Causation 467(1)
Identification and Overidentification 468(1)
Just Identified Models 468(1)
Overidentification 468(1)
Underidentification 469(1)
Latent Variable Models 469(6)
An Example of a Latent Variable Model 469(2)
How Latent Variables Are Estimated 471(1)
Fixed and Free Estimates in Latent 472(1)
Variable Models
Goodness-of-Fit Tests of Latent 472(1)
Variable Models
Latent Variable Models and the 473(1)
Correction for Attenuation
Characteristics of Data Sets That Make 474(1)
Latent Variable Analysis the Method of
A Review of Causal Model and Statistical 475(1)
Specification Error 475(1)
Identification Error 475(1)
Comparisons of Causal Models 476(1)
Nested Models 476(1)
Longitudinal Data in Causal Models 476(1)
Summary 477(2)
Alternative Regression Models: Logistic, 479(57)
Poisson Regression, and the Generalized
Linear Model
Ordinary Least Squares Regression 479(3)
Three Characteristics of Ordinary Least 480(1)
Squares Regression
The Generalized Linear Model 480(1)
Relationship of Dichotomous and Count 481(1)
Dependent Variables Y to a Predictor
Dichotomous Outcomes and Logistic 482(37)
Extending Linear Regression: The Linear 483(2)
Probability Model and Discriminant
The Nonlinear Transformation From 485(1)
Predictor to Predicted Scores: Probit
and Logistic Transformation
The Logistic Regression Equation 486(1)
Numerical Example: Three Forms of the 487(5)
Logistic Regression Equation
Understanding the Coefficients for the 492(1)
Predictor in Logistic Regression
Multiple Logistic Regression 493(1)
Numerical Example 494(3)
Confidence Intervals on Regression 497(1)
Coefficients and Odds Ratios
Estimation of the Regression Model: 498(1)
Maximum Likelihood
Deviances: Indices of Overall Fit of 499(3)
the Logistic Regression Model
Multiple R2 Analogs in Logistic 502(2)
Testing Significance of Overall Model 504(3)
Fit: The Likelihood Ratio Test and the
Test of Model Deviance
χ2 Test for the Significance of a 507(1)
Single Predictor in a Multiple Logistic
Regression Equation
Hierarchical Logistic Regression: 508(1)
Likelihood Ratio χ2 Test for the
Significance of a Set of Predictors
Above and Beyond Another Set
Akaike's Information Criterion and the 509(1)
Bayesian Information Criterion for
Model Comparison
Some Treachery in Variable Scaling and 509(3)
Interpretation of the Odds Ratio
Regression Diagnostics in Logistic 512(4)
Sparseness of Data 516(1)
Classification of Cases 516(3)
Extensions of Logistic Regression to 519(6)
Multiple Response Categories: Polytomous
Logistic Regression and Ordinal Logistic
Polytomous Logistic Regression 519(1)
Nested Dichotomies 520(2)
Ordinal Logistic Regression 522(3)
Models for Count Data: Poisson Regression 525(7)
and Alternatives
Linear Regression Applied to Count Data 525(1)
Poisson Probability Distribution 526(2)
Poisson Regression Analysis 528(2)
Overdispersion and Alternative Models 530(2)
Independence of Observations 532(1)
Sources on Poisson Regression 532(1)
Full Circle: Parallels Between Logistic 532(3)
and Poisson Regression, and the
Generalized Linear Model
Parallels Between Poisson and Logistic 532(2)
The Generalized Linear Model Revisited 534(1)
Summary 535(1)
Random Coefficient Regression and 536(32)
Multilevel Models
Clustering Within Data Sets 536(3)
Clustering, Alpha Inflation, and the 537(1)
Intraclass Correlation
Estimating the Intraclass Correlation 538(1)
Analysis of Clustered Data With Ordinary 539(4)
Least Squares Approaches
Numerical Example, Analysis of 541(2)
Clustered Data With Ordinary Least
Squares Regression
The Random Coefficient Regression Model 543(1)
Random Coefficient Regression Model and 544(6)
Multilevel Data Structure
Ordinary Least Squares (Fixed Effects) 544(1)
Regression Revisited
Fixed and Random Variables 544(1)
Clustering and Hierarchically 545(1)
Structured Data
Structure of the Random Coefficient 545(1)
Regression Model
Level 1 Equations 546(1)
Level 2 Equations 547(1)
Mixed Model Equation for Random 548(1)
Coefficient Regression
Variance Components---New Parameters in 548(1)
the Multilevel Model
Variance Components and Random 549(1)
Coefficient Versus Ordinary Least
Squares (Fixed Effects) Regression
Parameters of the Random Coefficient 550(1)
Regression Model: Fixed and Random
Numerical Example: Analysis of Clustered 550(3)
Data With Random Coefficient Regression
Unconditional Cell Means Model and the 551(1)
Intraclass Correlation
Testing the Fixed and Random Parts of 552(1)
the Random Coefficient Regression Model
Clustering as a Meaningful Aspect of the 553(1)
Multilevel Modeling With a Predictor at 553(2)
Level 2
Level 1 Equations 553(1)
Revised Level 2 Equations 554(1)
Mixed Model Equation With Level 1 554(1)
Predictor and Level 2 Predictor of
Intercept and Slope and the Cross-Level
An Experimental Design as a Multilevel 555(1)
Data Structure: Combining Experimental
Manipulation With Individual Differences
Numerical Example: Multilevel Analysis 556(4)
Estimation of the Multilevel Model 560(3)
Parameters: Fixed Effects, Variance
Components, and Level 1 Equations
Fixed Effects and Variance Components 560(1)
An Equation for Each Group: Empirical 560(3)
Bayes Estimates of Level 1 Coefficients
Statistical Tests in Multilevel Models 563(1)
Fixed Effects 563(1)
Variance Components 563(1)
Some Model Specification Issues 564(1)
The Same Variable at Two Levels 564(1)
Centering in Multilevel Models 564(1)
Statistical Power of Multilevel Models 565(1)
Choosing Between the Fixed Effects Model 565(1)
and the Random Coefficient Model
Sources on Multilevel Modeling 566(1)
Multilevel Models Applied to Repeated 566(1)
Measures Data
Summary 567(1)
Longitudinal Regression Methods 568(40)
Introduction 568(1)
Chapter Goals 568(1)
Purposes of Gathering Data on Multiple 569(1)
Analyses of Two-Time-Point Data 569(4)
Change or Regressed Change? 570(1)
Alternative Regression Models for 571(2)
Effects Over a Single Unit of Time
Three- or Four-Time-Point Data 573(1)
Repeated Measure Analysis of Variance 573(5)
Multiple Error Terms in Repeated 574(1)
Measure Analysis of Variance
Trend Analysis in Analysis of Variance 575(1)
Repeated Measure Analysis of Variance 576(2)
in Which Time Is Not the Issue
Multilevel Regression of Individual 578(10)
Changes Over Time
Patterns of Individual Change Over Time 578(4)
Adding Other Fixed Predictors to the 582(1)
Individual Differences in Variation 583(1)
Around Individual Slopes
Alternative Developmental Models and 584(2)
Error Structures
Alternative Link Functions for 586(1)
Predicting Y From Time
Unbalanced Data: Variable Timing and 587(1)
Missing Data
Latent Growth Models: Structural Equation 588(7)
Model Representation of Multilevel Data
Estimation of Changes in True Scores 589(1)
Representation of Latent Growth Models 589(5)
in Structural Equation Model Diagrams
Comparison of Multilevel Regression and 594(1)
Structural Equation Model Analysis of
Time Varying Independent Variables 595(1)
Survival Analysis 596(4)
Regression Analysis of Time Until 596(3)
Outcome and the Problem of Censoring
Extension to Time-Varying Independent 599(1)
Extension to Multiple Episode Data 599(1)
Extension to a Categorical Outcome: 600(1)
Event-History Analysis
Time Series Analysis 600(2)
Units of Observation in Time Series 601(1)
Time Series Analyses Applications 601(1)
Time Effects in Time Series 602(1)
Extension of Time Series Analyses to 602(1)
Multiple Units or Subjects
Dynamic System Analysis 602(2)
Statistical Inference and Power Analysis 604(1)
in Longitudinal Analyses
Summary 605(3)
Multiple Dependent Variables: Set 608(35)
Introduction to Ordinary Least Squares 608(2)
Treatment of Multiple Dependent Variables
Set Correlation Analysis 608(1)
Canonical Analysis 609(1)
Elements of Set Correlation 610(1)
Measures of Multivariate Association 610(3)
R2Y,X, the Proportion of Generalized 610(1)
T2Y,X and P2Y,X, Proportions of 611(2)
Additive Variance
Partialing in Set Correlation 613(2)
Frequent Reasons for Partialing 613(1)
Variable Sets From the Basic Sets
The Five Types of Association Between 614(1)
Basic Y and X Sets
Tests of Statistical Significance and 615(2)
Statistical Power
Testing the Null Hypothesis 615(1)
Estimators of the Population R2Y,X, 616(1)
T2Y,X, and P2Y,X
Guarding Against Type I Error Inflation 617(1)
Statistical Power Analysis in Set 617(2)
Comparison of Set Correlation With 619(1)
Multiple Analysis of Variance
New Analytic Possibilities With Set 620(1)
Illustrative Examples 621(6)
A Simple Whole Association 621(1)
A Multivariate Analysis of Partial 622(1)
A Hierarchical Analysis of a 623(2)
Quantitative Set and Its Unique
Bipartial Association Among Three Sets 625(2)
Summary 627(4)
Appendix 1: The Mathematical Basis for 631(5)
Multiple Regression/Correlation and
Identification of the Inverse Matrix
A1.1 Alternative Matrix Methods 634(1)
A1.2 Determinants 634(2)
Appendix 2: Determination of the Inverse 636(7)
Matrix and Applications Thereof
A2.1 Hand Calculation of the Multiple 636(4)
Regression/Correlation Problem
A2.2 Testing the Difference Between 640(2)
Partial βs and Bs From the Same
A2.3 Testing the Difference Between 642(1)
βs for Different Dependent
Variables From a Single Sample
Appendix Tables 643(12)
Table A t Values for α = .01, .05 643(1)
(Two Tailed)
Table B z' Transformation of r 644(1)
Table C Normal Distribution 645(1)
Table D F Values for α = .01, .05 646(4)
Table E L Values for α = .01, .05 650(2)
Table F Power of Significance Test of r 652(2)
at α = .01, .05 (Two Tailed)
Table G n* to Detect r by t Test at 654(1)
α = .01, .05 (Two Tailed)
References 655(16)
Glossary 671(12)
Statistical Symbols and Abbreviations 683(4)
Author Index 687(4)
Subject Index 691