
I-O Graduate Program Rankings: Update


Nicholas P. Salter, Joseph A. Allen, Allison S. Gabriel, 
Loren J. Naidoo, and David Sowinski


In the summer 2016 issue of TIP (Salter et al., 2016), we put out a Call for Proposals for updated I-O graduate program rankings. In this call, we were looking for new and unique methodologies for ranking I-O graduate programs that reflect the diversity of values and strengths across our field. We are pleased to announce we have conditionally accepted five proposals. Each of these teams will now conduct their project (which we hope all SIOP members will help with once data collection begins); we anticipate the final rankings will be published in the summer 2018 issue of TIP.

Please note that the goal of this project is to make our methodologies public before data collection to reduce the likelihood of researcher degrees of freedom influencing the results. Although frequently unintentional, it is all too common for analytic decisions to be driven in part by the results that follow. Our goal is to achieve transparency in how the rankings are conducted and to present multiple methodologies, to aid students and educators in their decision making. For more information about preregistration, see the Open Science Framework (https://osf.io) or List and McDaniel (2016).

Finally, another goal of this announcement is to give all TIP readers a heads-up that sometime next spring, you will receive requests to complete a survey gathering data for these projects. Please be sure to fill it out! If you don’t, your program may not be represented in the rankings (and the rankings therefore won’t be as complete as possible). It is important that as many SIOP members as possible help with data collection by completing the survey requests next spring. We thank you in advance for your participation!

Below are brief summaries of the projects.

A Proposal to Rank I-O Master's Programs Using Objective Data From I-O Coordinators

Stephen Vodanovich Ph.D., Valerie J. Morganson, Ph.D., and Steven J. Kass, Ph.D., University of West Florida

In our proposal to rank terminal MA/MS programs, we plan to collect objective, quantifiable data, via an online survey, from coordinators of MA/MS I-O programs. Our goal is to identify various criteria and place them into conceptually similar clusters. This will allow the ranking of programs within discrete categories, as well as provide an overall ranking. Examples of criteria and related clusters include: (a) applied experience (e.g., number of I-O classes requiring external consulting projects, percent of enrolled students who choose and complete internships annually, number of I-O classes requiring formal presentations for applied audiences), (b) faculty expertise/information (e.g., percent of I-O faculty with consulting experience, percent of I-O faculty who supervise students on applied projects, number of I-O faculty), (c) student accomplishments/information (e.g., percent of graduates who obtain work in an I-O-related job within a year after graduation; percent of students who are members of an I-O-related student chapter, such as SHRM or ATD; percent of students with assistantship positions), and (d) curriculum (e.g., total required I-O program credit hours; number of required internship hours; ratio of full-time I-O faculty to I-O master’s students; frequency of I-O course offerings; explicit coverage of consulting skills, legal issues, ethical issues, and diversity).

Rankings of I-O Psychology Master’s Programs Based on Program
Experience and Satisfaction: A Multiple-Perspective Evaluation

Yalcin Acikgoz, Ph.D., Tim Huelsman, Ph.D., Amanda Dixon, B.A., Amanda Ross, B.A., Jessica Swets, B.A., Ryan Olsen, B.S., and Stephanie Jeffer, B.S., Appalachian State University

Although the importance of objective outcomes such as research output or graduation rate is beyond dispute, program experience is also a factor that prospective graduate students are likely to take into account. This is especially true for master’s-level programs, which place a weaker emphasis on research productivity. Accordingly, this proposal, which is aimed at ranking master’s-level programs, focuses mainly on soft measures of program success such as program culture, program resources supporting student success, and overall satisfaction with the program. This will ensure that, in addition to objective outcomes, program experience is also reflected in the rankings. To provide a rich, multifaceted perspective on program success and achieve comprehensiveness, the team will collect data from current students, alumni, and employers of recent alumni, ensuring that each group of stakeholders provides input into the rankings.

An Interdisciplinarity Index for I-O Psychology Graduate Programs

Richard N. Landers, Ph.D., and Michael B. Armstrong, M.S., Old Dominion University

In science, a discipline can be defined as a group of researchers with a central problem to be solved; agreed-upon facts relevant to solving that problem; and shared explanations, goals, and theories for addressing it (Wagner et al., 2011). Importantly, disciplines have existed in science for only the last few hundred years; science historically took a “big tent” approach, with researchers across the world collaborating as needed. The shift over time to discipline-centered science has had numerous effects, both positive and negative. In response, there are increasing calls among funding agencies to reward and recognize interdisciplinary research. Despite this, I-O psychology has no way to identify successful interdisciplinary researchers. If a prospective graduate student wants to pursue graduate education with a broad perspective, or even graduate education with a particular interdisciplinary impact (conducting research that benefits those both inside and outside of I-O psychology), there is no clear way to identify which programs are most likely to provide that opportunity. To provide guidance to such students, we will develop several indices of interdisciplinarity to capture the degree to which I-O psychologists contribute to, and are influential within, other disciplines. Furthermore, we will present this information within an analytic dashboard, showing the disciplines outside of I-O psychology that each program most commonly contributes to and influences.

Ranking PhD I-O Programs by Quantity and Quality of Development Opportunities

Nicholas Howald, B.A., Samantha Nesnidol, B.A., and Kristin Horan, M.A., Bowling Green State University

The proposed study will rank PhD I-O programs according to the type and quality of development opportunities they offer to graduate students. These development opportunities will be categorized into three areas: applied, academic (teaching), or academic (research). Development opportunities are defined as aspects of a program that provide graduate students the chance to develop knowledge, skills, and abilities that are relevant to these three broad areas of I-O. An example development opportunity for the applied category is the presence of an in-house consulting firm; an example from the academic (teaching) category is the opportunity for students to serve as an instructor of record. Graduate students will rate their programs on each of the development opportunities with regard to (a) whether or not the program offers that opportunity and (b) how early it is offered. Subject matter experts (SMEs) will also provide ratings of importance of each opportunity. The student ratings will be weighted by the SME ratings before being averaged. This will result in a single score for each program in each of the three areas. This ranking system will allow prospective students to compare programs within three broad areas of I-O, thus giving them the opportunity to select programs based on their respective interests and career goals. The provision of three separate scores for each program (rather than one holistic rating) also allows programs to showcase their strengths in specific areas.

I-O Graduate Programs Rankings Based on Student Perceptions

Jenna-Lyn R. Roman, B.A., Baruch College, and Christina N. Barnett, B.A., University of South Florida

Prospective graduate students must make informed decisions about the program they choose to attend based on many criteria to determine which school best fits their needs. Using student perceptions of quality to rank graduate schools is one method of evaluating those criteria (e.g., Kraiger & Abalos, 2004). This study will consist of three phases. During criterion development, a sample of current students will be asked to list the criteria typically used to evaluate or choose a graduate program (e.g., research interests of faculty, research productivity, availability of funding). Then, the list of criteria will be rated by a sample of program directors, faculty, and students to determine the importance of each variable as it relates to the quality of graduate programs. During the final phase, surveys of the current population of graduate students will gather ratings of their perceptions of their programs to determine program rankings. Overall rankings and rankings on selected criteria (e.g., program culture, program costs) will be determined separately for PhD and MA/MS programs. The primary goal of this research is to update student evaluations of master’s and doctoral programs. This will provide a more holistic evaluation of industrial and organizational psychology programs to prospective graduate students.


References

Kraiger, K., & Abalos, A. (2004). Rankings of graduate programs in I-O psychology based on student ratings of quality. The Industrial-Organizational Psychologist, 42(1), 28-43.

List, S. K., & McDaniel, M. A. (2016). I-O Psychology’s lack of research integrity. The Industrial-Organizational Psychologist, 54(2).

Salter, N. P., Allen, J. A., Gabriel, A. S., Sowinski, D., & Naidoo, L. (2016). Call for proposals for updated graduate program rankings. The Industrial-Organizational Psychologist, 54(1).

Wagner, C. S., Roessner, J. D., Bobb, K., Klein, J. T., Boyack, K. W., Keyton, J., … Börner, K. (2011). Approaches to understanding and measuring interdisciplinary scientific research (IDR): A review of the literature. Journal of Informetrics, 5(1), 14-26.