1 Introduction and Preview
Overview · Multivariate Analysis: A Broad Definition · Multivariate Analysis: A Narrow Definition · Some Important Themes · Obtaining Meaningful Relations · Selecting Cutoffs · Questions of Statistical Inference · Outliers · The Importance of Theory · Problems Peculiar to the Analysis of Scales · The Role of Computers in Multivariate Analysis · Multivariate Analysis and the Personal Computer · Choosing a Computer Package · Problems in the Use of Computer Packages · The Importance of Matrix Procedures

2 Some Basic Statistical Concepts
Overview · Univariate Data Analysis · Frequency Distributions · Normal Distributions · Standard Normal Distributions · Parameters and Statistics · Locational Parameters and Statistics · Measures of Variability · A Note on Estimation · Binary Data and the Binomial Distribution · Data Transformation · Bivariate Data Analysis · Characteristics of Bivariate Relationships · Bivariate Normality · Measures of Bivariate Relation · Range Restriction · Pearson Correlation Formulas in Special Cases · Non-Pearson Estimates of Pearson Correlations · The Eta-Square Measure · Phi Coefficients with Unequal Probabilities · Sampling Error of a Correlation · The Z' Transformation · Linear Regression · The Geometry of Regression · Raw-Score Formulas for the Slope · Raw-Score Formulas for the Intercept · Residuals · The Standard Error of Estimate · Why the Term "Regression"? · A Summary of Some Basic Relations · Statistical Control: A First Look at Multivariate Relations · Partial and Part Correlation · Statistical versus Experimental Control · Multiple Partialling · Within-Group, Between-Group, and Total Correlations

3 Some Matrix Concepts
Overview · Basic Definitions · Square Matrices · Transposition · Matrix Equality · Basic Matrix Operations · Matrix Addition and Subtraction · Matrix Multiplication · Correlation Matrices and Matrix Multiplication · Partitioned Matrices and Their Multiplication · Some Rules and Theorems Involved in Matrix Algebra · Products of Symmetric Matrices · More about Vector Products · Exponentiation · Determinants · Matrix Singularity and Linear Dependency · Matrix Rank · Matrix "Division" · The Inverse of a 2 × 2 Matrix · Inverses of Higher-Order Matrices · Recalculation of an Inverse Following Deletion of Variable(s) · An Application of Matrix Algebra · More about Linear Combinations · The Mean of a Linear Combination · The Variance of a Linear Combination · Covariances between Linear Combinations · The Correlation between Two Different Linear Combinations · Correlations between Linear Combinations and Matrix Notation · The "It Don't Make No Nevermind" Principle · Eigenvalues and Eigenvectors · A Simple Eigenanalysis · Eigenanalysis of Gramian Matrices

4 Multiple Regression and Correlation, Part 1: Basic Concepts
Overview · Assumptions Underlying Multiple Regression · The Multivariate Normal Distribution · A Bit of the Geometry of Multiple Regression · Basic Goals of Regression Analysis · The Case of Two Predictors · A Visual Example · A Note on Suppressor Variables · Computational Formulas · Raw-Score Formulas · Other Equations for R² · Determining the Relative Importance of the Two Predictors · Bias in Multiple Correlation · The Case of More Than Two Predictors · Checking for Multicollinearity · Another Way to Obtain R² · Residuals · Inferential Tests · Testing R · Testing Beta Weights · Testing the Uniqueness of Predictors · Evaluating Alternative Equations · Cross Validation · Computing a Correlation from A Priori Weights · Testing the Difference between R² and r² Derived from A Priori Weights · Hierarchical Inclusion of Predictors · Stepwise Inclusion of Predictors · Other Ways to Handle Multicollinearity · Comparing Alternative Equations · Example 1: Perfect Prediction · Example 2: Imperfect Prediction plus a Look at Residuals · Example 3: Real Personality Assessment Data · Alternative Approaches to Data Aggregation

5 Multiple Regression and
Like most academic authors, my views are a joint product of my teaching and my research. Needless to say, my views reflect the biases that I have acquired. One way to articulate the rationale (and limitations) of my biases is through the preface of a truly great text of a previous era, Cooley and Lohnes (1971, p. v). They draw a distinction between mathematical statisticians whose intellect gave birth to the field of multivariate analysis, such as Hotelling, Bartlett, and Wilks, and those who chose to "concentrate much of their attention on methods of analyzing data in the sciences and of interpreting the results of statistical analysis . . . (and) . . . who are more interested in the sciences than in mathematics, among other characteristics." I find the distinction between individuals who are temperamentally "mathematicians" (whom philosophy students might call "Platonists") and "scientists" ("Aristotelians") useful as long as it is not pushed to the point where one assumes "mathematicians" completely disdain data and "scientists" are never interested in contributing to the mathematical foundations of their discipline. I certainly feel more comfortable attempting to contribute in the "scientist" rather than the "mathematician" role. As a consequence, this book is primarily written for individuals concerned with data analysis. However, as noted in Chapter 1, true expertise demands familiarity with both traditions.
Springer Book Archives