PSYC 7431 Behavioral Objectives


East Carolina University
Department of Psychology

Behavioral Objectives for PSYC 7431 and PSYC 7433

  1. For a two-factor ANOVA, describe the simple main effects of one factor as being a partitioning of other effects (sums of squares) computed in the omnibus analysis (in the source table).
  2. Describe the reversal paradox (Messick & van de Geer, Psychological Bulletin, 90, 582-593), using an example with hypothetical data (a 2 x 2 table of sample sizes with means or frequencies will do).
  3. Discuss three general types of solutions to the problem of unequal sample sizes in a two-factor independent samples ANOVA.
  4. Define and give an example of a three-way interaction and explain how one would continue analysis after finding a significant three-way interaction in an ANOVA.
  5. Explain the distinction between crossed and nested factors using subjects in a two-factor mixed (among-within) design ANOVA as an example.
  6. Explain why having an interaction mean square in the denominator of an F ratio is not generally desirable and how one could change such an undesirable situation when testing main effects in a mixed or random effects factorial ANOVA.
  7. Discuss the differences between parametric and nonparametric inferential procedures. Contrast the hypotheses they test and the assumptions they make. Include Wilcoxon’s rank tests (which you should be able to conduct without reference to a textbook) in your discussion. Assume you are using them to detect effects upon the location of the dependent variable. (A computational sketch follows this list.)
  8. Describe what happens to the sampling distributions for standard nonparametric tests when the sample size gets large.
  9. Give an example of data that could be analyzed with either Kendall’s coefficient of concordance or with a Friedman test. Explain how these two mathematically identical procedures look at the same data from different perspectives. (A sketch follows this list.)
  10. Describe how resampling statistics differ from traditional inferential statistics. Explain how bootstrapping differs from randomization/permutation tests. (A sketch follows this list.)
  11. For each of the statistical techniques listed below: identify the type of research question asked, the goal of the analysis, and the number of allowable covariates, dependent variables, and independent variables. Discuss the theoretical limitations (underlying assumptions) and comment on how the assumptions may be tested, how robust the technique is to their violation, and how violations may be avoided or corrected. Discuss any practical limitations. Give (hypothetical or real) examples of research that could be analyzed with the technique. Given an example of research, be able to choose an appropriate technique. Given computer output, be able to interpret the results and write a results section. Given a set of data, be able to analyze it with the statistical packages available on our computers.
  1. Canonical correlation/regression
  2. Categorical data analysis: Log-linear models
  3. Cluster analysis
  4. Discriminant function analysis
  5. Hierarchical Linear Modeling
  6. Least squares ANOVA and analysis of covariance
  7. Logistic regression
  8. MANOVA
  9. Multiple correlation/regression, including mediation and moderation models
  10. Multivariate approach to repeated measures ANOVA
  11. Path analysis
  12. Principal components and factor analysis
  13. Structural equation modeling
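
As a companion to the objective on Wilcoxon's rank tests above, here is a minimal Python sketch, assuming SciPy is available; the groups, scores, and seed are invented for illustration. It runs the rank-sum test (computed as the equivalent Mann-Whitney U) for independent samples and the signed-ranks test for paired scores.

```python
# Hypothetical data invented for illustration; SciPy assumed available.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
treated = rng.normal(loc=55, scale=10, size=15)   # independent samples
control = rng.normal(loc=50, scale=10, size=15)

# Wilcoxon rank-sum test (computed here as the equivalent Mann-Whitney U)
u, p_rank_sum = stats.mannwhitneyu(treated, control, alternative="two-sided")

# Wilcoxon signed-ranks test for paired scores (pre/post on the same cases)
pre = rng.normal(loc=50, scale=10, size=15)
post = pre + rng.normal(loc=3, scale=5, size=15)
t_stat, p_signed_rank = stats.wilcoxon(post, pre)

print(f"Rank-sum:    U = {u:.1f}, p = {p_rank_sum:.3f}")
print(f"Signed-rank: T = {t_stat:.1f}, p = {p_signed_rank:.3f}")
```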
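
For the objective on Kendall's coefficient of concordance and the Friedman test, a minimal sketch with invented ratings (10 judges rating 4 objects) shows the two perspectives on the same data; the conversion of the Friedman chi-square into W uses the standard relationship W = chi-square / [N(k - 1)].

```python
# Hypothetical data: 10 judges (blocks) each rate 4 objects (conditions); values invented.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(2)
ratings = rng.normal(loc=[5.0, 5.5, 6.5, 7.0], scale=1.0, size=(10, 4))
n_blocks, k = ratings.shape

# Friedman perspective: do the objects differ in location, blocking on judges?
chi2, p = friedmanchisquare(*[ratings[:, j] for j in range(k)])

# Concordance perspective: how well do the judges agree in their rankings?
W = chi2 / (n_blocks * (k - 1))   # Kendall's W is the Friedman chi-square rescaled
print(f"Friedman chi-square = {chi2:.2f} (p = {p:.4f}); Kendall's W = {W:.3f}")
```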
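
And for the objective contrasting bootstrapping with randomization/permutation tests, the sketch below (invented two-group data, NumPy only) highlights the key difference: the bootstrap resamples with replacement to estimate a sampling distribution, while the permutation test reshuffles group labels without replacement under the null hypothesis of exchangeability.

```python
# Hypothetical two-group data; effect size, sample sizes, and seed are invented.
import numpy as np

rng = np.random.default_rng(3)
a = rng.normal(52, 10, 25)
b = rng.normal(47, 10, 25)
observed = a.mean() - b.mean()

# Bootstrap: resample WITH replacement within each group; percentile 95% CI
boot = np.array([rng.choice(a, a.size).mean() - rng.choice(b, b.size).mean()
                 for _ in range(10_000)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

# Permutation test: reshuffle group labels WITHOUT replacement under the null
pooled = np.concatenate([a, b])
perm = np.empty(10_000)
for i in range(perm.size):
    shuffled = rng.permutation(pooled)
    perm[i] = shuffled[:a.size].mean() - shuffled[a.size:].mean()
p_value = np.mean(np.abs(perm) >= abs(observed))

print(f"M1 - M2 = {observed:.2f}, 95% bootstrap CI [{ci_low:.2f}, {ci_high:.2f}], "
      f"two-tailed permutation p = {p_value:.4f}")
```
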
  1. Be able to screen data to detect missing scores, outliers, and out-of-range scores and discuss what steps should be taken to deal with these problems.
  2. Explain how to determine whether or not a set of data is approximately normally distributed and how to deal with data that are not approximately normally distributed (including data transformations). (A sketch follows this list.)
  3. Explain how to determine whether or not a set of data should be analyzed with a procedure which requires homogeneity of variances and describe how to deal with the problem of heterogeneity of variance.
  4. Be able to conduct and interpret curvilinear regression analysis, including polynomial regression analysis.
  5. Be able to do simple matrix algebra, including adding, subtracting, multiplying, and inverting matrices (a worked example follows this list).
  6. Define multicollinearity, explain why this can be a problem, and describe how to deal with it when it is a problem.
  7. Singular intercorrelation matrices, those which have no inverse, will prevent the completion of a multiple regression analysis. List the circumstances under which such matrices are obtained.
  8. Write the general equation for a bivariate linear regression, identify each of its terms, and describe how the regression line and the data from which it was generated can be represented in a two-dimensional space (using a Cartesian coordinate system). Do the same for a trivariate regression (two predictors) in three-dimensional space, describing the regression plane. Now, off into hyperspace. Do the same for a regression with p (p > 2) predictors.
  9. Define R, the multiple correlation coefficient, as a bivariate correlation.
  10. Discuss the relationship among R (the multiple correlation coefficient), N (the number of observations), and p (the number of predictors) and how this affects estimation of the strength of linear association (R²) in the population from which the sample data were drawn.
  11. Discuss how one would and why one would want to standardize multiple regression coefficients.
  12. Describe the reversal paradox in terms of a trivariate linear regression.
  13. Discuss why multiple linear regression coefficients are sometimes referred to as partial regression coefficients.
  14. Describe both partial and semipartial correlation coefficients in terms of bivariate correlations. Use Venn diagrams to distinguish between the two. (A computational sketch follows this list.)
  15. For a p (p > 2) predictor multiple linear regression, describe the relationship between the squared multiple correlation coefficient and a set of (p - 1) semipartial correlations plus one nonpartialled correlation. Hint: approach the problem as a sequential regression. (The decomposition is written out after this list.)
  16. Give an example of a trivariate linear regression (Y, X1, X2) in which X2 functions as a suppressor variable (classical suppression would be the most simple example). Use a Venn diagram to explain the effect of the suppressor variable.
  17. Be able to detect suppressor effects and classify them as classical, net, or cooperative.
  18. Be able to use diagnostic procedures to evaluate the leverage, distance, and influence of particular observations in a regression analysis (a sketch follows this list).
  19. Describe stepwise or hierarchical multiple regression and comment on how one determines the number of predictors to be entered and the order in which they are to be entered.
  20. Describe how one could cross-validate the multiple regression analysis obtained from a very large data set.
  21. Be able to determine whether the regression line for predicting Y for one population is coincident with the regression line for predicting Y for a second population (including tests of homogeneity of slopes, and homogeneity of intercepts).
  22. Be able to conduct and interpret the results of logistic regression analysis.
  23. Explain how a one-way ANOVA can be computed using multiple linear regression analysis (a demonstration follows this list).
  24. Explain the use of dummy variable coding and reduced models to partition the regression sum of squares from a multiple linear regression analysis into the A, B, and AxB sums of squares that would be obtained by the traditional ANOVA.
  25. Use a Venn diagram to picture the nonorthogonality of the effects of A, B, and AxB produced when a twoway ANOVA has unequal sample sizes. Shade in those areas representing the variances estimated by the Method I (Overall & Spiegel, Psychological Bulletin, 72, 311) least squares partitioning of the sums of squares. Name the traditional computational approach which approximates this least squares analysis.
  26. List and briefly discuss the assumptions of the analysis of covariance, including how these assumptions can be tested.
  27. Give hypothetical examples of and discuss the advantages/limitations and interpretation of an analysis of covariance when: a.) the treatment groups' means on the covariate all equal one another, or nearly so; b.) the treatments have different covariate means, but the treatment has no effect on the covariate; c.) the treatment directly affects the covariate.
  28. Discuss the computation, meaning, and uses of adjusted means in an analysis of covariance.
  29. Discuss the meaning of the assumption of compound symmetry in the population covariance matrix for a one-factor within-subjects (randomized blocks) ANOVA and tell how this assumption is simplified in the one-factor independent samples analysis.
  30. Explain why the effect of subjects is not usually tested in a one-factor within-subjects design and state the circumstances under which such an effect could be tested via ANOVA.
  31. Be able to conduct and interpret factorial ANOVA with repeated measures or randomized blocks factors, including both the univariate and multivariate approaches and doubly multivariate analysis.
  32. Explain how Bayes’ theorem is used to compute posterior probabilities from prior probabilities and likelihoods. Describe how the Bayesian approach to the testing of hypotheses differs from the traditional approach. (A worked example follows this list.)
  33. Be able to conduct and interpret discriminant function analysis.
  34. Describe common uses of principal components analysis (PCA) and factor analysis (FA).
  35. Be able to conduct and interpret PCA and FA.
  36. Describe how one should decide how many components or factors to retain in a PCA or FA (a sketch follows this list).
  37. Explain how a rotated solution differs from an unrotated solution in PCA and FA and discriminate between orthogonal and oblique rotations.
  38. Explain the primary difference between PCA and FA.
  39. Explain the differences between exact factor scores and unit-weighted factor scores and discuss the merits of each.
  40. Describe how one can compare the factor structure in one population with that in another population.
  41. Discuss the factors involved in determining how many subjects are needed to conduct a PCA or FA.
  42. Be able to conduct and interpret multidimensional contingency table analysis, including logit analysis.
  43. Be able to conduct and interpret both one-way and factorial MANOVA and MANCOVA.
  44. Be able to use basic multiple regression software and matrix manipulations to conduct path analysis.
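
For the data-screening objectives near the top of the list above, a minimal Python sketch (SciPy assumed available; the reaction-time variable is simulated) of checking approximate normality and trying a normalizing transformation:

```python
# Hypothetical positively skewed variable (simulated reaction times).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rt = rng.lognormal(mean=6.0, sigma=0.5, size=200)

# Screen the shape: skewness, kurtosis, and a formal test of normality
print("skew =", round(stats.skew(rt), 2), " kurtosis =", round(stats.kurtosis(rt), 2))
print("Shapiro-Wilk (raw):", stats.shapiro(rt))

# A log or Box-Cox transformation often pulls in a positive skew
log_rt = np.log(rt)
bc_rt, lam = stats.boxcox(rt)          # Box-Cox chooses lambda by maximum likelihood
print("Shapiro-Wilk (log):", stats.shapiro(log_rt))
print("Box-Cox lambda =", round(lam, 2))
```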
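
The matrix-algebra objective can be rehearsed with NumPy; a small worked example with two arbitrary 2 x 2 matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[1.0, 0.0],
              [4.0, 2.0]])

print(A + B)               # element-wise addition
print(A - B)               # element-wise subtraction
print(A @ B)               # matrix multiplication (rows of A by columns of B)
A_inv = np.linalg.inv(A)   # the inverse exists because det(A) = 5, not 0
print(A @ A_inv)           # recovers the identity matrix (within rounding error)
```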
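
For the objective on partial and semipartial correlations, the residual-based definitions can be made concrete. In this sketch (invented trivariate data), the partial correlation removes X2 from both Y and X1, while the semipartial removes X2 from X1 only.

```python
# Hypothetical trivariate data; the correlation structure is invented.
import numpy as np

rng = np.random.default_rng(6)
n = 500
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)
y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

def residuals(v, w):
    """Residuals of v after removing the linear effect of w (simple regression)."""
    b = np.cov(v, w)[0, 1] / np.var(w, ddof=1)
    return v - (v.mean() + b * (w - w.mean()))

def r(u, v):
    return np.corrcoef(u, v)[0, 1]

partial = r(residuals(y, x2), residuals(x1, x2))   # r(Y, X1 | X2): X2 removed from both
semipartial = r(y, residuals(x1, x2))              # r(Y, X1.2): X2 removed from X1 only
print(f"partial = {partial:.3f}, semipartial = {semipartial:.3f}")
```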
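
The sequential decomposition named in the objective on the squared multiple correlation can be written out explicitly (the entry order 1, 2, ..., p is arbitrary; each later predictor is partialled for those entered before it):

```latex
% R^2 as one zero-order (nonpartialled) correlation plus (p - 1) squared
% semipartial correlations from a sequential regression.
R^{2}_{Y \cdot 12 \ldots p}
  = r^{2}_{Y1}
  + r^{2}_{Y(2 \cdot 1)}
  + r^{2}_{Y(3 \cdot 12)}
  + \cdots
  + r^{2}_{Y(p \cdot 12 \ldots (p-1))}
```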
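
For the objective on leverage, distance, and influence, a minimal sketch using statsmodels; the data are invented, with one deliberately aberrant case appended so that the diagnostics have something to find.

```python
# Hypothetical regression data with one extreme case added at the end.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
x = rng.normal(50, 10, 40)
y = 2.0 * x + rng.normal(0, 10, 40)
x = np.append(x, 90.0)      # extreme on X: high leverage
y = np.append(y, 60.0)      # also far from the line: high influence

model = sm.OLS(y, sm.add_constant(x)).fit()
infl = model.get_influence()

leverage = infl.hat_matrix_diag              # leverage (hat values)
distance = infl.resid_studentized_external   # externally studentized residuals
influence = infl.cooks_distance[0]           # Cook's D combines leverage and distance

worst = np.argmax(influence)
print(f"Case {worst}: leverage = {leverage[worst]:.3f}, "
      f"studentized residual = {distance[worst]:.2f}, Cook's D = {influence[worst]:.2f}")
```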
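
The equivalence behind the objective on computing a one-way ANOVA via multiple regression can be demonstrated directly with dummy-coded group membership; the three groups of 20 below are invented.

```python
# Hypothetical one-way design with three groups; scores invented.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(8)
y = np.concatenate([rng.normal(50, 10, 20),
                    rng.normal(55, 10, 20),
                    rng.normal(62, 10, 20)])
group = np.repeat([0, 1, 2], 20)

# Traditional one-way ANOVA
F_anova, p_anova = stats.f_oneway(y[group == 0], y[group == 1], y[group == 2])

# The same analysis as a regression on k - 1 dummy variables
d1 = (group == 1).astype(float)   # dummy: group 1 vs. reference group 0
d2 = (group == 2).astype(float)   # dummy: group 2 vs. reference group 0
X = sm.add_constant(np.column_stack([d1, d2]))
fit = sm.OLS(y, X).fit()

print(f"ANOVA:      F = {F_anova:.3f}, p = {p_anova:.4f}")
print(f"Regression: F = {fit.fvalue:.3f}, p = {fit.f_pvalue:.4f}")   # identical
```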
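
A worked numerical example of Bayes’ theorem, with an invented prior and invented likelihoods:

```python
# Invented prior and likelihoods for illustration only.
prior_h = 0.10                    # P(H): prior probability the hypothesis is true
likelihood_d_given_h = 0.80       # P(D | H): probability of the data if H is true
likelihood_d_given_not_h = 0.20   # P(D | not-H)

# P(H | D) = P(D | H) P(H) / [P(D | H) P(H) + P(D | not-H) P(not-H)]
numerator = likelihood_d_given_h * prior_h
marginal = numerator + likelihood_d_given_not_h * (1 - prior_h)
posterior_h = numerator / marginal
print(f"posterior P(H | D) = {posterior_h:.3f}")   # 0.308: the data raise P(H) from .10 to about .31
```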
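
Finally, for the objective on deciding how many components or factors to retain, a minimal sketch using scikit-learn with simulated items built to load on two latent dimensions. The Kaiser criterion and the scree plot shown here are only starting points; parallel analysis (not shown) is often preferred.

```python
# Simulated item-level data; the two-factor loading structure is invented.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(10)
n = 300
f1, f2 = rng.normal(size=(2, n))                     # two latent dimensions
items = np.column_stack([f1 + rng.normal(0, .5, n) for _ in range(3)] +
                        [f2 + rng.normal(0, .5, n) for _ in range(3)])

pca = PCA().fit((items - items.mean(0)) / items.std(0))   # PCA on standardized items
eigenvalues = pca.explained_variance_

print("eigenvalues (scree):", np.round(eigenvalues, 2))
print("retain by Kaiser criterion (eigenvalue > 1):", int(np.sum(eigenvalues > 1)))
```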