
Volume 55     Number 2    October 2017      Editor: Tara Behrend


Ranking PhD I-O Programs by Development Opportunities

Nicholas Howald and Sami Nesnidol, Bowling Green State University; Kristin Horan, University of Central Florida; and Russell A. Matthews, University of Alabama

Previous rankings of I-O graduate programs have often focused on “objective” criteria related to research outcomes, including publication rate and conference representation. Our approach differs by ranking I-O PhD programs according to the development opportunities offered to graduate students in three core areas: teaching, research, and applied practice. Current PhD students were surveyed about the development opportunities available to them in their respective programs, and we then weighted these opportunities using ratings from current SIOP members with experience in I-O related careers. Under this ranking scheme, the most highly ranked schools varied across the three development areas, suggesting that prospective students should consider their future development and career goals when assessing potential graduate programs. Further implications and limitations of our results are discussed.

 

An examination of the industrial-organizational (I-O) literature reveals no shortage of rankings of I-O PhD programs (e.g., Beiler, Zimmerman, Doerr, & Clark, 2014; Gibby, Reeve, Grauer, Mohr, & Zickar, 2002; Howard, Maxwell, Berra, & Sternitzke, 1985). Although there are exceptions (e.g., Kraiger & Abalos, 2004), these studies typically rely on metrics such as research productivity, journal prestige, and conference representation. These are important aspects of a program, and they deserve to be quantified and investigated. As pointed out by Salter, Allen, Gabriel, Sowinski, and Naidoo (2016) in their Call for Proposals, however, these criteria may be less important for prospective graduate students whose career goals lie outside of research-focused academia. Students interested in applied careers, for example, may desire schools with more opportunities to develop and practice applied skills and thus will place less weight on research productivity in their decision making. This is especially pertinent to a field like I-O psychology, where approximately half of all SIOP members have a career with an applied focus (Silzer & Parson, 2011). Many prospective students may also be interested in an academic career with a focus on teaching. Most existing ranking methodologies do not incorporate information relevant to future teachers and therefore may result in students overlooking programs that excel in providing them with opportunities more closely aligned with their career goals (Salter et al., 2016).

 

Additionally, “objective” criteria such as publications or other indices of research success commonly used to rank programs (Beiler et al., 2014) fail to account for current student perceptions. Current students’ judgments are likely more pertinent to the needs of prospective students, as the experiences of current graduate students should closely resemble those of incoming students. To this end, Kraiger and Abalos (2004) used current student judgments on 20 different criteria to create rankings that are relevant for prospective students. They created an overall ranking, as well as rankings on three broad factors (program resources, program culture, and program costs) derived from their 20 criteria. The present study utilizes a similar approach by focusing on three broad factors that are important to I-O PhD programs. However, we developed our factors a priori to focus on those relevant to prospective students’ career trajectories. Heeding the call of Salter et al. (2016), we use ratings from current students to describe I-O PhD programs based on the development opportunities they provide.

 

Although current graduate students are an effective source of information on the opportunities and resources a program provides to graduate students, they are still in training and are therefore likely to have limited career experience. In comparison, trained I-O psychologists in academic and applied positions are likely to be better judges of the value of various development opportunities for learning the skills necessary to perform well in I-O related jobs. We therefore use current SIOP members employed in I-O related careers (e.g., consulting, research, academia) as subject matter experts (SMEs) on the importance of these development opportunities for the three core areas (teaching, research, and applied practice) instrumental to many I-O careers.

 

Development Opportunities

 

We define development opportunities as opportunities that programs provide for students to engage in activities and practices that help students gain skills instrumental to careers as I-O psychologists. We categorize these opportunities into three broad areas: teaching, research, and applied practice. We acknowledge that these areas overlap for many I-O psychologists and that I-O careers often emphasize skills in more than one of these areas. This framework, however, allows prospective students to compare schools across multiple dimensions and make decisions based on the area(s) most related to their future career goals. For example, students who wish to pursue a career as a teaching-focused academic may weight programs’ teaching and research development opportunity scores more heavily when choosing which schools to apply to and which to attend. In contrast, a student who is less certain of what I-O career they would like to pursue (e.g., practitioner vs. academic) may choose to apply to schools that rank moderately high in all three development areas, rather than giving preference to the highest-ranking schools in any one area.

 

Teaching development opportunities allow students to develop skills in various aspects of teaching, such as managing classrooms, designing courses, evaluating student work, and relaying information to students. Programs that rank highly in this area provide students with more varied teaching experiences throughout their graduate careers, such as serving as a teaching assistant or instructor of record. We distinguish among several teaching opportunities, including assistantships with administrative duties, assistantships that involve teaching a lab or recitation, and serving as an instructor of record. A full list of these opportunities can be found in Table 1. Such development opportunities are likely to be valued by students pursuing a career in academia, especially those hoping to apply to teaching-focused schools.

 

Research development opportunities are conceptually the most similar to the traditional ranking criteria described previously. However, we assess opportunities that help students develop the skills necessary to carry out research rather than measuring the outcomes of research (e.g., journal publications). Programs that rank highly in this area encourage research in their students, provide multiple opportunities to learn necessary skills and methods to conduct high quality research, and provide collaborative, research-focused environments. For example, we examine whether programs provide opportunities for funding, collaboration with other researchers, and opportunities for research assistantships. Prospective students interested in a career in academia or an applied career with a research focus will likely benefit from programs offering more of these opportunities.

 

Applied practice development opportunities help students gain experience in applying I-O psychology research findings to organizations, interacting with organizational stakeholders, and communicating with lay audiences. Programs that rank highly in this area may provide graduate students with opportunities to work on consulting projects overseen by faculty or include courses requiring the completion of projects emphasizing skills commonly used by I-O psychologists in applied settings (e.g., job analyses, performance appraisal). Prospective students interested in a career as a practitioner, or as an academic who also consults independently or oversees student consulting projects, may be interested in programs ranking highly in this area.

Table 1

Development Opportunities by Category

| Category | Item | Wave 2 SME rating M | Wave 2 SME rating SD |
|---|---|---|---|
| Teaching | Opportunities to serve as a teaching assistant with grading or administrative duties | 3.33 | 1.17 |
| | Opportunities to serve as a teaching assistant with duties including the instruction of lab or recitation sessions | 3.96 | .95 |
| | Opportunities to serve as an instructor of record | 4.34 | .99 |
| | A graduate level course on teaching or instruction of undergraduate students | 3.59 | 1.05 |
| | Opportunities to gain feedback from faculty on teaching | 4.05 | .97 |
| | Opportunities for guest lecturing | 3.69 | .99 |
| | Opportunities to teach a variety of courses and levels | 4.23 | 1.01 |
| | Opportunities to network with alumni in the academic (teaching) world | 3.01 | 1.11 |
| Research | Opportunities to be a research assistant | 4.20 | .90 |
| | Awards for excellence in research | 3.03 | 1.03 |
| | A graduate level course on up-to-date research methods and statistics | 4.48 | .72 |
| | A graduate level course or workshop on grant writing | 3.07 | 1.08 |
| | Opportunities to apply for internal funding for graduate student research | 3.48 | 1.04 |
| | Opportunities to apply for external funding for graduate student research | 3.37 | 1.00 |
| | Formal mechanisms (e.g., research groups) that encourage student involvement in student-driven research | 3.83 | .97 |
| | Formal mechanisms (e.g., research groups) that encourage student involvement in faculty-driven research | 4.07 | .88 |
| | Class projects that build research skills (e.g., research proposals) | 4.07 | .88 |
| | Encouragement of collaboration among researchers in other areas, departments, and schools | 3.56 | 1.08 |
| | Conference presence and encouragement for students to attend conferences | 4.13 | .91 |
| | Opportunities to work with faculty and graduate students who are productive researchers | 4.56 | .75 |
| | Opportunities to work with faculty and graduate students who share your research interests | 4.37 | .78 |
| | Opportunities to network with alumni in the academic (research) world | 3.60 | 1.02 |
| Applied | Class projects that build applied skills (e.g., job analysis) | 4.03 | .92 |
| | A graduate level course on consulting and/or applied topics | 3.83 | .97 |
| | Opportunities for consulting through faculty | 4.11 | .94 |
| | An in-house consulting firm | 4.00 | 1.05 |
| | Opportunities for consulting (other than through faculty and in-house consulting firms) | 4.28 | .91 |
| | Opportunities for applied internships or jobs | 4.63 | .70 |
| | Opportunities to network with alumni in the applied world | 3.91 | .99 |
| | Relationships with local businesses | 3.84 | 1.02 |
| | A graduate level course on methods and statistics used in applied work | 3.95 | .99 |

Note. n = 826. Members of SIOP served as SMEs.

 

We assess the presence of development opportunities offered by programs in each of these three areas. In addition, we factored into the rankings the average earliest year in the program during which these opportunities are offered. Programs that offer development opportunities earlier in the program allow students more flexibility in choosing when to engage in opportunities, as well as more time to spend engaging in the opportunities relevant to their developmental and career goals.

 

The Present Study

 

In sum, we use two waves of data collection to rank programs on the extent to which they provide development opportunities in three core areas: teaching, research, and applied practice. In Wave 1, we used judgments from subject matter experts (SMEs) in I-O psychology PhD programs to derive a list of development opportunities within each area. In Wave 2, we collected information from students in I-O PhD programs about which development opportunities their respective programs offered. Judgments from professionals currently employed in I-O careers were used to weight the importance of these development opportunities for gaining skills in each of the three core areas.

 

Method

 

Wave 1

 

The first three authors independently generated lists of development opportunities for each of the three categories (i.e., teaching, research, and applied practice) before combining and agreeing upon 22 initial criteria. We then administered these initial criteria to a sample of 32 current I-O PhD students from Baruch College, the University of South Florida, and Bowling Green State University. These students served as SMEs for Wave 1. We contacted students from these three programs to ensure a well-rounded view of development opportunities and to help prevent a single program from dominating the rankings simply because its students developed the criteria. Participants indicated the perceived helpfulness of each criterion for developing skills in the corresponding area on a five-point Likert scale (1 = not at all helpful to 5 = extremely helpful). Criteria with mean ratings of three or higher were retained. An open-ended item asking for additional criteria and their perceived helpfulness was included for each development area. After examining the item ratings and open-ended responses, a final list of 31 criteria (14 research, 8 teaching, and 9 applied) was retained for use in Wave 2 (see Table 1).

 

Wave 2

 

Participants. In Wave 2, we surveyed student affiliates and members of the Society for Industrial-Organizational Psychology (SIOP) via the SIOP listserv. SIOP members served as SMEs. A total of 396[1] student responses and 826 member responses were used. The 32 participants from Wave 1 were contacted as a part of this effort. In the student sample, 119 (31%) identified as male and 203 (53%) identified as female, with an average age of 28.20 years (SD = 6.86). The modal year in program was second. In the member sample, 340 (41%) identified as male and 302 (37%) identified as female, with an average age of 40.86 years (SD = 12.26).

 

Procedures. We created separate surveys for SIOP student affiliates and SIOP members. Students were asked to indicate which development opportunities their current program provided from the list of 31 criteria retained from Wave 1. Each criterion was rated with one of three responses (yes, no, unsure); only yes responses were counted as an indicator that a program provided that opportunity. Students were also asked to indicate the year in the program (e.g., first year, second year) in which each criterion was first offered to students. Last, they were asked to indicate to what extent they personally used or engaged in each opportunity offered by their program, rated on a 5-point Likert scale (1 = not at all to 5 = a great deal). This last metric was used to examine whether students of some programs took advantage of more opportunities than others; these ratings did not affect program rankings. To minimize bias, the authors did not provide ratings for their own programs.

The SIOP members survey asked participants to rate how effective each of the retained Wave 1 criteria was to developing skills in the corresponding area (e.g., research; from 1 = not at all effective to 5 = extremely effective). Although not every SME we surveyed was necessarily an expert in every area, we believe averaging across many SME judgments gives a robust assessment of the importance of each opportunity.

 

Results

 

Table 1 includes means and standard deviations of SIOP member ratings of each criterion. Each program received a score for each development area (i.e., teaching, research, and applied), resulting in three separate indices per program. Scores were calculated through the following steps. First, a weighted sum of the number of development opportunities offered in each area was created for each participant, with criteria weighted by the mean importance ratings from SIOP members. This yielded three development opportunity scores (teaching, research, and applied practice) for each participant. These scores were then averaged across students for each program; to help protect against sampling error, scores were calculated only for programs with at least four student responses. This resulted in three development opportunity scores for each program. Last, each score was adjusted by dividing it by the average year in which development opportunities in the corresponding area were first offered. Hypothetically, if a program offered every teaching development opportunity in the first year of the program, no adjustment would be made to the program’s teaching development opportunity score.
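The scoring steps above can be sketched in a few lines of Python. This is a minimal illustration under assumed data structures (the response format and the function name `program_score` are ours for illustration, not the authors' actual analysis code):

```python
from statistics import mean

def program_score(student_responses, weights, min_n=4):
    """Compute one development-area score (e.g., teaching) for a program.

    student_responses: one dict per student mapping criterion name ->
        (offered: bool, earliest_year: int or None)  -- assumed format.
    weights: criterion name -> mean SME importance rating (Table 1).
    """
    if len(student_responses) < min_n:
        return None  # fewer than four responses: no score (sampling-error guard)

    # Step 1: weighted sum of offered opportunities, per student.
    student_scores = [
        sum(weights[c] for c, (offered, _) in resp.items() if offered)
        for resp in student_responses
    ]

    # Step 2: average across students in the program.
    raw = mean(student_scores)

    # Step 3: divide by the average earliest year the opportunities are
    # offered; everything available in year 1 means dividing by 1 (no change).
    years = [year for resp in student_responses
             for offered, year in resp.values() if offered and year]
    return raw / (mean(years) if years else 1.0)
```

Under this scheme, a program whose opportunities all first become available in year 2 would have its raw score halved relative to an otherwise identical program offering them in year 1.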

 

It is worth noting that the correlations between scores adjusted by year and unadjusted scores are high, ranging from .83 to .88. However, adjusting by year gives credit to programs that provide opportunities earlier. We believe that a development opportunity that is accessible earlier in the program is generally more valuable than one that can only be accessed in later years. Students have a limited amount of time in graduate school, and earlier access to opportunities allows them more flexibility in choosing which opportunities to pursue, more experience with opportunities they are interested in, and more time to receive feedback and develop their skills in a given area. The means, standard deviations, and intercorrelations for program scores in each of the three areas are provided in Table 2.

 

Table 2

Means, Standard Deviations, and Correlations for Program Scores

| | M | SD | Teaching score | Research score | Applied score |
|---|---|---|---|---|---|
| Teaching score | 13.28 | 3.60 | n = 37 | | |
| Research score | 35.57 | 10.02 | .81* | n = 40 | |
| Applied score | 16.18 | 5.88 | .10 | .31 | n = 40 |

Note. *p < .01. Number of programs with a score in a given category is indicated on the diagonal.

 

The full rankings of programs are shown in Tables 3 through 5; we indicate the number of student affiliate responses for each program in parentheses. Given concerns such as sampling error, the difference between any two programs close to each other in the rankings is unlikely to be practically meaningful. Therefore, we recommend that prospective students not focus on the specific ordering of similarly ranked programs but instead use the tables to obtain a general picture of the opportunities a program offers relative to the other programs in our sample.

 

The adjusted rankings were compared to ratings on depth of involvement in opportunities. Interestingly, programs that offered more teaching development opportunities did not have students who were more involved in those opportunities, r = .17, p = .30. In contrast, programs that offered more research and applied opportunities had students who were more involved in those opportunities, r = .58, p < .001 and r = .38, p = .02, respectively.

 

Discussion

 

The resulting rankings showcase the variety of PhD programs offered in I-O psychology. Although many programs appear within the top 50% for all three development areas, very few programs rank highly across all categories. This demonstrates the utility of the criteria used here, as differences in the types of development opportunities programs provide appear to be evident to students. Furthermore, it seems unlikely that students uniformly rated their programs highly on every opportunity offered.

 

These rankings are useful for prospective graduate students faced with complex decisions about which programs to apply to, visit, and attend. Students can use these results to determine which programs may be best suited to their ultimate career goals. Students with well-defined career goals (e.g., applied practitioner) may choose to place more weight on schools’ rankings in one particular area (e.g., applied practice). In contrast, students with less well-defined career goals may opt to apply to programs that rank moderately well across all three development areas. It is worth noting that a low ranking on these lists is not meant as an absolute judgment of a given program—this is not an exhaustive list of all I-O programs and it represents just one method of conceptualizing rankings.

 

This study has several limitations. First, it is possible that some students purposefully misrepresented their programs (either by artificially inflating or deflating their ratings). As Kraiger and Abalos (2004) point out, this is a problem inherent to all subjective ranking criteria. Although such response distortion may occur, we agree with Kraiger and Abalos (2004) that, when possible, subjective ratings should be used in conjunction with more objective criteria to provide corroborating evidence. Additionally, we removed participants who indicated that their program provided all possible development opportunities, and no participant indicated that their program had fewer than four opportunities. Lastly, the rankings indicate that no program is in the top five for all categories, providing evidence that the effects of response distortion were likely limited.

 

We also adjusted raw scores in each development area based on the earliest year opportunities were offered. This rewards programs that provide opportunities to students earlier in their graduate training. We acknowledge that there may be some situations where this does not necessarily promote development; for example, offering students the chance to teach a course early in their career without proper training may be detrimental to their growth as teachers. Nonetheless, we believe that a tendency to offer development opportunities earlier rather than later is ultimately beneficial to graduate students. As mentioned, the correlations between adjusted and unadjusted scores are very high, indicating that this adjustment does not greatly affect rankings, although it does result in some changes to rank ordering. Interested scholars can contact the corresponding author to obtain the unadjusted rankings.

 

Table 3

Program Rankings in Teaching Development Opportunities

| Ranking | Program |
|---|---|
| 1 | Texas A&M University (7) |
| 2 | University of Minnesota (9) |
| 3 | Northern Illinois University (5) |
| 4 | University of Georgia (10) |
| 5 | Michigan State University (11) |
| 6 | George Mason University (16) |
| 7 | Bowling Green State University (9) |
| 8 | Saint Louis University (4) |
| 9 | Auburn University (11) |
| 10 | University of Akron (12) |
| 11 | University of Oklahoma (9) |
| 12 | University of Missouri - St. Louis (12) |
| 13 | University of South Florida (4) |
| 14 | Colorado State University (11) |
| 15 | Old Dominion University (6) |
| 16 | Florida International University (7) |
| 17 | Clemson University (6) |
| 18 | Pennsylvania State University (6) |
| 19 | University of Connecticut (5) |
| 20 | Baruch College (19) |
| 21 | Portland State University (5) |
| 22 | Rice University (15) |
| 23 | George Washington University (6) |
| 24 | State University of New York at Albany (6) |
| 25 | University of Nebraska at Omaha (4) |
| 26 | University of Central Florida (6) |
| 27 | Roosevelt University (5) |
| 28 | University of Houston (10) |
| 29 | Louisiana Tech University (16) |
| 30 | Central Michigan University (8) |
| 31 | Florida Institute of Technology (5) |
| 32 | Illinois Institute of Technology (17) |
| 33 | University of Illinois at Urbana-Champaign (6) |
| 34 | The Chicago School of Professional Psychology (4) |
| 34 | Teachers College, Columbia University (6) |
| 36 | Capella University (8) |
| 37 | Grand Canyon University (16) |

Note. Numbers in parentheses indicate sample size of student responses from that institution. Only programs with at least four responses are included in this table.

 

Table 4

Program Rankings in Research Development Opportunities

| Ranking | Program |
|---|---|
| 1 | University of Minnesota (9) |
| 2 | University of Georgia (10) |
| 3 | Michigan State University (11) |
| 4 | Texas A&M University (7) |
| 5 | Pennsylvania State University (6) |
| 6 | Old Dominion University (6) |
| 7 | Florida International University (7) |
| 8 | University of South Florida (4) |
| 9 | University of Houston (10) |
| 10 | Colorado State University (11) |
| 11 | Roosevelt University (5) |
| 12 | Northern Illinois University (5) |
| 13 | Portland State University (5) |
| 14 | University of Nebraska at Omaha (4) |
| 15 | University of Missouri - St. Louis (12) |
| 16 | Auburn University (11) |
| 17 | Saint Louis University (4) |
| 18 | George Mason University (16) |
| 19 | Bowling Green State University (9) |
| 20 | Louisiana Tech University (16) |
| 21 | University of Connecticut (5) |
| 22 | Clemson University (6) |
| 23 | University of Oklahoma (9) |
| 24 | Baruch College (19) |
| 25 | Rice University (16) |
| 26 | Central Michigan University (8) |
| 27 | University of Akron (12) |
| 28 | George Washington University (6) |
| 29 | Teachers College, Columbia University (8) |
| 30 | State University of New York at Albany (6) |
| 31 | Illinois Institute of Technology (17) |
| 32 | University of Central Florida (7) |
| 33 | University of Illinois at Urbana-Champaign (6) |
| 34 | Florida Institute of Technology (5) |
| 35 | The Chicago School of Professional Psychology (4) |
| 36 | University of Phoenix (6) |
| 37 | Walden University (7) |
| 38 | Alliant International University (5) |
| 39 | Grand Canyon University (17) |
| 40 | Capella University (9) |

Note. Numbers in parentheses indicate sample size of student responses from that institution. Only programs with at least four responses are included in this table.

 

 

Table 5

Program Rankings in Applied Development Opportunities

| Ranking | Program |
|---|---|
| 1 | Louisiana Tech University (16) |
| 2 | Pennsylvania State University (6) |
| 3 | Roosevelt University (5) |
| 4 | The Chicago School of Professional Psychology (4) |
| 5 | Illinois Institute of Technology (17) |
| 6 | Alliant International University (4) |
| 7 | Michigan State University (11) |
| 8 | University of Nebraska at Omaha (4) |
| 9 | University of Houston (10) |
| 10 | Bowling Green State University (9) |
| 11 | Clemson University (6) |
| 12 | Saint Louis University (4) |
| 13 | University of South Florida (4) |
| 14 | Old Dominion University (6) |
| 15 | University of Akron (11) |
| 16 | Portland State University (4) |
| 17 | Florida Institute of Technology (5) |
| 18 | University of Georgia (10) |
| 19 | George Mason University (16) |
| 20 | Rice University (14) |
| 21 | University of Minnesota (9) |
| 22 | Colorado State University (11) |
| 23 | Northern Illinois University (5) |
| 24 | University of Connecticut (5) |
| 25 | University of Central Florida (6) |
| 26 | State University of New York at Albany (6) |
| 27 | George Washington University (6) |
| 28 | Texas A&M University (7) |
| 29 | Florida International University (7) |
| 30 | University of Oklahoma (9) |
| 31 | University of Missouri - St. Louis (12) |
| 32 | Central Michigan University (8) |
| 33 | Teachers College, Columbia University (6) |
| 34 | Grand Canyon University (16) |
| 35 | University of Phoenix (6) |
| 36 | Auburn University (11) |
| 37 | Walden University (6) |
| 38 | Capella University (8) |
| 39 | Baruch College (19) |
| 40 | University of Illinois at Urbana-Champaign (6) |

Note. Numbers in parentheses indicate sample size of student responses from that institution. Only programs with at least four responses are included in this table.

The median number of students admitted to an I-O PhD program each year is about four (Tett, Walser, Brown, Simonet, & Tonidandel, 2013). Given the relatively small number of PhD students who are likely to be in a program at any given time, our rankings are subject to sampling error. Fortunately, the programs in the top 50% of each category had an average of 8.37 responses each. Nonetheless, rankings for programs with small sample sizes should be viewed with some caution. In addition, some subjectivity is expected in ratings. Averaging across student ratings may hide substantial variability in how students perceive, experience, and utilize development opportunities in a program. Different students will likely have somewhat qualitatively different experiences in the same program. Despite this, we found that, on average, raters tended to demonstrate high within-program agreement on scores in each development opportunity area (average ICC(2) = .93).

 

Conclusion

 

In this study, we focused on parsimonious and broad criteria with prospective I-O PhD students as the intended audience. In general, it is important to note that no single ranking system is perfect. There are numerous criteria on which programs can be ranked and these criteria may differ in importance depending on who is using the rankings (e.g., current faculty, prospective students). The source of data may also affect rankings—perceptions of current students may differ from those of current faculty. It would be advantageous for prospective students to use our rankings in conjunction with other criteria (e.g., match with faculty research interest, publication rate, location). Many previously published rankings (e.g., Beiler et al., 2014; Gibby et al., 2002; Kraiger & Abalos, 2004) and other rankings created in response to the Call for Proposals by Salter et al. (2016) provide useful information on I-O programs and should also be consulted by prospective students. Overall, ranking I-O PhD programs on development opportunities provides a practically useful complement to other ranking methodologies.

 

Note

[1] We removed nine participants who indicated that their program provided all possible development opportunities, as this could indicate artificial rating inflation. These nine participants were each enrolled in a different I-O program. 

 

References

 

Beiler, A. A., Zimmerman, L. M., Doerr, A. J., & Clark, M. A. (2014). An evaluation of research productivity among I-O psychology doctoral programs. The Industrial-Organizational Psychologist, 51(3), 40-52.

 

Gibby, R. E., Reeve, C. L., Grauer, E., Mohr, D., & Zickar, M. J. (2002). The top I-O psychology doctoral programs of North America. The Industrial-Organizational Psychologist, 39(4), 17-25.

 

 Howard, G. S., Maxwell, S. E., Berra, S. M., & Sternitzke, M. E. (1985). Institutional research productivity in industrial/organizational psychology. Journal of Applied Psychology, 70(1), 233-236. doi:10.1037/0021-9010.70.1.233

 

Kraiger, K., & Abalos, A. (2004). Rankings of graduate programs in I-O psychology based on student ratings of quality. The Industrial-Organizational Psychologist, 42(1), 28-43.

 

Salter, N. P., Allen, J. A., Gabriel, A. S., Sowinski, D., & Naidoo, L. (2016). Call for proposals for updated graduate program rankings. The Industrial-Organizational Psychologist, 54(1).

 

Silzer, R. F., & Parson, C. (2011). SIOP membership and representation. The Industrial-Organizational Psychologist, 49(2), 85-96.

 

Tett, R. P., Walser, B., Brown, C., Simonet, D. V., & Tonidandel, S. (2013). The 2011 SIOP I–O psychology graduate program benchmarking survey: Part II: Admission standards and procedures. The Industrial–Organizational Psychologist, 50(3), 13-34.

 
