The Industrial Organizational Psychologist

The 2011 SIOP Graduate Program Benchmarking Survey, Part 8: Correlations and Latent Themes

Robert P. Tett, Benjamin Walser, and Cameron Brown
University of Tulsa

The 2011 SIOP survey of I-O graduate programs was undertaken to identify normative benchmarks of current practices in the education of I-O practitioners, researchers, and educators. The data offer three main uses. First, they allow individual programs to see where they stand in comparison to peer programs (i.e., MA vs. PhD, psychology vs. business/management), offering confirmation and exploration of program identity (e.g., for marketing purposes) and leverage in securing better resources (e.g., to raise stipends to competitive levels). The second use is as a baseline for tracking changes over time in how I-O programs are composed and managed; seeing trends in I-O education could offer uniquely valuable insights into where the field is headed in light of where it's been. The third application is to advance discourse on how to improve graduate education in I-O, with an eye to the possibility of licensure and program accreditation. Regardless of where one stands on those controversial issues, hard data serve more informed discussion.

Each of the previous seven installments provides a relatively pixelated snapshot of a major part of I-O graduate training (basic program features, admissions, curriculum, assistantships, internships, comprehensive exams, and theses/dissertations). Here, in our last installment, we attempt to take stock of what the data mean collectively. This is no easy task, as there are hundreds of variables offering thousands of relationships, all with limited power imposed by an overall modest sample size. Identifying major themes nonetheless seems a reasonable pursuit, and that is our goal here. There are many ways to distill a dataset such as ours.
We tried a series of "nested" principal components analyses (with oblique rotation), starting with variables within a given table, repeating across tables in the same TIP article, and leading to a third-order PCA of lower-order factors from all seven articles. Difficulty in interpreting factors led us to a simpler, regression-like correlational strategy beginning with a putative distinction between IVs and DVs. Five sets of variables were selected as IVs because of their uniquely informative quality: (1) program type (degree type, department type), (2) basic program features (department size, program size, number of graduates per year), (3) SIOP competency factors (I-focused, O-focused, methods, individuals/teams, general psychology, applied cognition), (4) self-rated preparation of students for I
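The nested PCA strategy described above can be sketched in outline. The following is a minimal illustration with synthetic data, not the authors' actual analysis: the table names and block sizes are hypothetical, and the oblique rotation step is omitted for brevity (plain unrotated components are used), so it shows only the "components within tables, then components across tables" logic.

```python
import numpy as np

def pca_scores(X, n_components=1):
    # Standardize columns, then project onto the top principal
    # components of the correlation matrix (via eigendecomposition).
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    R = np.corrcoef(Z, rowvar=False)
    vals, vecs = np.linalg.eigh(R)          # eigenvalues in ascending order
    top = vecs[:, ::-1][:, :n_components]   # eigenvectors of the largest eigenvalues
    return Z @ top

rng = np.random.default_rng(0)
n_programs = 100

# Hypothetical variable blocks standing in for survey tables.
tables = {
    "admissions": rng.normal(size=(n_programs, 4)),
    "curriculum": rng.normal(size=(n_programs, 6)),
    "assistantships": rng.normal(size=(n_programs, 3)),
}

# First order: one component score per table.
first_order = np.column_stack([pca_scores(X, 1) for X in tables.values()])

# Second order: PCA across the first-order component scores
# (a third order would repeat this across articles).
second_order = pca_scores(first_order, 1)
print(second_order.shape)   # (100, 1)
```

In a real application, each first-order solution would retain as many (obliquely rotated) components as the data support, rather than the single unrotated component used here for simplicity.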