The Top I-O Psychology Doctoral Programs of North America 

Robert E. Gibby 
Bowling Green State University 

Charlie L. Reeve 
Purdue University 

Eyal Grauer, David Mohr, and Michael J. Zickar 
Bowling Green State University 

The task of identifying the best doctoral programs in various psychological disciplines has received a great deal of attention over the past two decades. In the area of industrial-organizational (I-O) psychology alone, several studies have attempted to rank graduate programs based on three main criteria: (a) program reputation as judged by psychology department chairs (e.g., U.S. News & World Report, 1995, 2001), (b) editorial board membership of faculty in I-O graduate programs (e.g., Jones & Klimoski, 1991), and (c) research productivity. With regard to the latter criterion, rankings have been based on counts of articles published in I-O-related journals (Levine, 1990; Winter, Healy, & Svyantek, 1995) and of student paper presentations at professional conferences (Payne, Succa, Maxey, & Bolton, 2001; Surette, 1989).

Of course, much like Division I-A football polls, ranking of I-O psychology programs is not without controversy. Rankings based on program reputation have been sharply criticized for what amounts to criterion deficiency and criterion contamination (see Cox & Catt, 1977; Winter et al., 1995). In particular, reputational rankings may be based on outdated perceptions of faculty prestige (rather than productivity), may be biased by general perceptions of the university as a whole rather than the program in question, and rely on raters who are unlikely to be fully informed of the intricacies of each program. For these reasons, program reputation is usually considered a deficient index of a program's quality (Winter et al., 1995).

Rankings based on faculty membership on the editorial boards of academic journals offer an alternative method free of many of the problems associated with reputation-based rankings. However, this method is not without fault, either. Such an index gauges the productivity of a faculty member only indirectly and does not fully capture the faculty member's contribution to the program. In fact, involvement on editorial boards may take away time otherwise spent with graduate students and the institution. Further, this method penalizes programs with younger faculty who may be more productive but less well established. Although this method of ranking captures some appropriate criterion space, it is clearly not sufficient.

Considering the limitations and the types of information provided by reputational and editorial board rankings, and based on the assumption that the primary purpose of a doctoral program is research, others have offered rankings of I-O graduate program quality based on research productivity. The advantage of this method is that it is somewhat more objective than the prior two criteria. Publications are a direct result of research productivity and offer some control for research quality (assuming low-quality research will not make it through the peer-review process, an assumption that is not always appropriate). Prior studies using this method, however, have restricted their investigation to only a few outlets. For example, Levine (1990) counted articles published only in the Journal of Applied Psychology. Surette (1989) tabulated student presentations only for a single conference, the Annual Industrial-Organizational Psychology/Organizational Behavior Graduate Student Conference. Howard, Maxwell, Berra, and Sternitzke (1985) and Winter et al. (1995) were somewhat more comprehensive, but still limited: Both studies offered rankings based on articles published in five top I-O-related journals (i.e., Academy of Management Journal, Academy of Management Review, Journal of Applied Psychology, Organizational Behavior and Human Decision Processes, and Personnel Psychology). Additionally, Winter et al. (1995) attempted to correct for differences in author contribution by using an index that assigns more or less credit for authorship based on the total number of authors and the target author's rank order, the presumption being that sole authorship reflects more work by an individual than being, say, fifth author out of 12.

The purpose of the present study is to update and extend previous investigations of program research productivity. The last investigation of research productivity based on journal publications covered publications through 1995. Because much can change in 6 years (e.g., faculty move, retire, or pass away; new faculty are hired; research programs end or start), we felt an update was needed. Second, the current study extends prior work by using a more comprehensive assessment of research productivity. Whereas previous work relied on a limited time frame and set of publication outlets, we provide program rankings based on two time frames (last 5 years and total career) and two sets of publication outlets (top 10 I-O journals and total publications).

Method 

The present study sought first to update the ranking system used by Winter et al. (1995). The rankings reported by Winter et al. assessed research productivity up to 1995. Therefore, the same five journals[1] were consulted for the years 1996 through 2000 to obtain indices of research productivity for each I-O program. Within these journals, each article was checked for the authors' institutional and departmental affiliation. Points were awarded only to faculty and graduate students of psychology programs and not to those of business schools or other disciplines. Points for each article were assigned to the authors within graduate psychology programs according to Howard, Cole, and Maxwell's (1987) formula:

$$\text{credit} = \frac{1.5^{\,n-i}}{\sum_{j=1}^{n} 1.5^{\,j-1}}$$

where n is the total number of authors on the published research, i is the position of the author of interest among those n authors, and j indexes the authors in the denominator sum. Therefore, according to the formula, the second author on an article with a total of three authors was awarded 0.32 of a point. To obtain a ranking of the graduate program, the point totals of faculty and students (according to their institutional affiliation) were summed.

[1] It is important to note that these five journals were independently found to be the most influential I-O journals by Zickar and Highhouse (2001).
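For readers who wish to reproduce the credit calculation, the following is a minimal sketch of the weighting formula; the function name and the worked example are ours for illustration, and only the formula itself comes from Howard, Cole, and Maxwell (1987).

```python
def author_credit(position, n_authors):
    """Credit for the author at a given position (1 = first author) on an
    article with n_authors total authors, per the formula above:
    credit = 1.5**(n - i) / sum of 1.5**(j - 1) for j = 1..n."""
    numerator = 1.5 ** (n_authors - position)
    denominator = sum(1.5 ** (j - 1) for j in range(1, n_authors + 1))
    return numerator / denominator

# Worked example from the text: the second of three authors receives
# 1.5 / (1 + 1.5 + 2.25) = 1.5 / 4.75, or roughly 0.32 of a point.
print(round(author_credit(2, 3), 2))  # 0.32
```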

The second goal was to provide a ranking of programs based on a broader set of publication outlets. The rationale is that I-O psychologists occasionally publish in other I-O journals as well as in non-I-O journals, such as American Psychologist or the Journal of Personality and Social Psychology. In addition, book chapters, books, and edited books are an important component of research productivity within departments (Nederhof, 1989). As such, we calculated four indices of institutional productivity. The first two indices were based on publications in the top 10 I-O journals as indexed by Zickar and Highhouse (2001). Rankings for the top 10 journals were determined both for the past 5 years and for the entire career of the faculty member, regardless of whether the faculty member had been at another institution before arriving at the current one. The third and fourth indices ranked programs based on total publications for the period 1996–2000 and for total career. Therefore, each institution received a rank in four different categories (2 time periods × 2 productivity indices).

Finally, these four ranks were summed and divided by four, yielding an average rank for the institution across the four indices. To correct for differences in the number of faculty, the resulting average rank was then divided by the number of faculty at the institution to provide an average per capita productivity rank for the graduate program. Although this may seem like a lot of rankings, we felt it was better to look at programs from multiple perspectives: Each index provides a different type of information about the program (e.g., recent productivity versus consistency over time).
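As an illustration only, the rank-aggregation step described above might look like the sketch below. The institution names, point totals, and faculty counts are placeholders rather than the study's data, and the per capita step follows the wording of the text literally (average rank divided by faculty size, then re-ranked).

```python
# A minimal sketch of combining the four productivity indices into an
# average rank and a per capita rank; all numbers are placeholders.
def rank_order(values, ascending=False):
    """Return rank 1..k for each key; by default, higher values rank better.
    Ties are broken by iteration order in this simplified sketch."""
    ordered = sorted(values, key=values.get, reverse=not ascending)
    return {key: pos + 1 for pos, key in enumerate(ordered)}

# Four productivity indices (2 time periods x 2 sets of outlets), in points.
indices = [
    {"Univ A": 18.0, "Univ B": 11.1, "Univ C": 6.4},    # top-10 journals, recent
    {"Univ A": 71.8, "Univ B": 24.5, "Univ C": 25.8},   # top-10 journals, career
    {"Univ A": 41.6, "Univ B": 28.3, "Univ C": 24.2},   # all outlets, recent
    {"Univ A": 135.5, "Univ B": 57.4, "Univ C": 82.0},  # all outlets, career
]
faculty_count = {"Univ A": 9, "Univ B": 6, "Univ C": 8}

ranks = [rank_order(index) for index in indices]
average_rank = {u: sum(r[u] for r in ranks) / len(ranks) for u in faculty_count}

# Per the text, the average rank is then divided by the number of faculty
# and the result re-ranked (lower values rank better here).
per_capita = {u: average_rank[u] / faculty_count[u] for u in faculty_count}
per_capita_rank = rank_order(per_capita, ascending=True)

print(average_rank)
print(per_capita_rank)
```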

To calculate the indices, the Web sites of 19 of the top 20 graduate programs, according to the results of the top five journal rankings, and of the Georgia Institute of Technology (which was included based on its 10th-place ranking in the 2001 U.S. News index) were consulted for a listing of current faculty members. Columbia University, which was in the top 20, was excluded from the present investigation because it lacks a PhD program in I-O psychology. All faculty listed as members of the I-O department were entered into PsycINFO (and Historical PsycINFO where necessary) to obtain a comprehensive listing of publications, including journal articles, book chapters, books, and edited books, for each faculty member (errata, obituaries, letters to editors, dissertations, and comments were not included). Emeritus faculty were not included in the determination of point totals. Once this list was generated, research productivity point totals based on Howard et al.'s (1985) formula were obtained for each of the four productivity indices.
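To make the tallying step concrete, here is a small hypothetical sketch; the record fields, document-type labels, and sample records are our assumptions for illustration and do not describe the authors' actual PsycINFO workflow.

```python
# Hypothetical illustration of scoring publication records by institution.
# The record format and sample data are assumptions made for this sketch.
EXCLUDED_TYPES = {"erratum", "obituary", "letter", "dissertation", "comment"}

def author_credit(i, n):
    # Howard, Cole, and Maxwell (1987) weighting, as sketched earlier.
    return 1.5 ** (n - i) / sum(1.5 ** (j - 1) for j in range(1, n + 1))

def score_by_institution(records):
    """Sum authorship credit per institution, skipping excluded document types."""
    totals = {}
    for rec in records:
        if rec["doc_type"] in EXCLUDED_TYPES:
            continue
        credit = author_credit(rec["position"], rec["n_authors"])
        totals[rec["institution"]] = totals.get(rec["institution"], 0.0) + credit
    return totals

# Made-up example: two countable records and one excluded erratum.
records = [
    {"institution": "Univ A", "doc_type": "article", "position": 1, "n_authors": 2},
    {"institution": "Univ A", "doc_type": "erratum", "position": 1, "n_authors": 1},
    {"institution": "Univ B", "doc_type": "chapter", "position": 2, "n_authors": 3},
]
print(score_by_institution(records))  # roughly {'Univ A': 0.6, 'Univ B': 0.32}
```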

Results 

Productivity ratings from the top five I-O-related journals are provided in Table 1 alongside the rankings published by U.S. News and World Report (2001). As can be seen in Table 1, there is disagreement in the rank ordering between the current top five journal rankings and the U.S. News (2001) rankings. In addition, two institutions on the U.S. News list are not in the top 10 based on publication rates in the top five journals. It is important to note, however, that the top five journal rankings were the only rankings that captured student involvement in research, as they were based not on faculty but on departmental affiliation.

Table 1. Productivity Ratings of Psychology PhD Graduate Programs in the Top Five I-O Psychology-Oriented Journals, 1996–2000

Institution | Total points in top five journals | U.S. News ranking
1 Michigan State University | 30.26 (1) | 1
2 University of Illinois at Urbana-Champaign | 26.64 (3) | 6
3 Bowling Green State University | 11.28 (4) |
4 Florida International University | 9.40 (20) |
5 University of Akron | 7.20 (9) | 8
6 University of Minnesota | 7.12 (7) | 2
7 University of Michigan | 6.25 |
8 University of Maryland | 5.72 (13) | 4
9 Pennsylvania State University | 5.01 (2) | 3
10 George Mason University | 4.93 | 8
11 Texas A&M University | 4.78 (16) |
12 Purdue University | 4.41 (8) |
13 New York University | 3.58 (11) |
14 Tulane University | 3.28 |
15 University of South Florida | 3.21 (19) | 7
16 Colorado State University | 2.91 (5) |
17 University of Connecticut | 2.71 |
18 University at Albany-SUNY | 2.57 |
19 Columbia University | 2.50 |
20 University of Georgia | 2.47 (10) |
21 University of Calgary | 2.44 |
22 DePaul University | 2.42 |
23 Central Michigan University | 2.40 |
24 Rice University | 2.22 |
25 Illinois Institute of Technology | 2.04 |
26 Georgia Institute of Technology | 2.03 | 10
27 Wright State University | 2.00 |
28 University of Houston | 1.96 |
29 Louisiana State University | 1.95 |
30 Ohio University | 1.82 |
31 Kansas State University | 1.61 (17) |
32 University of Missouri-St. Louis | 1.50 (6) |
33 Portland State University | 1.39 |
34 University of Waterloo | 1.32 |
35 Rutgers University | 1.20 |
36 Wayne State University | 1.14 |
37 Claremont Graduate University | 1.00 |
38 Clemson University | 0.92 |
39 Virginia Tech | 0.92 |
40 George Washington University | 0.88 |
41 University of Central Florida | 0.88 |

 Note. Parentheses represent the top 20 rankings provided in Winter et al. (1995). 

Table 2 presents I-O graduate program rankings based on publication in the top 10 I-O journals for the last 5 years and for total career. As displayed in Table 2, 8 of the top 10 programs from the top five I-O journal rankings remained in the top 10, with the University of South Florida and the University of Georgia moving ahead of Florida International University and the University of Michigan. In addition to changes in rank ordering between schools for the top five and top 10 I-O journal rankings, there are discrepancies within schools between career and recent productivity. Table 2 gives some indication of how productive an institution has been historically (based on career output in the top 10 journals) compared with its research productivity in top I-O outlets over the past 5 years. From this comparison, it is apparent that some programs have slowed their productivity over the past 5 years, whereas others have increased their publication in top I-O outlets.

 

Table 2. Productivity Ratings of Faculty in Psychology PhD Graduate Programs in the Top Ten I-O Psychology-Oriented Journals, 1996–2000 and Career

Institution | Top ten journals research output: 1996–2000 | Top ten journals research output: career
1 Michigan State University | 18.04 | 71.82 (1)
2 Bowling Green State University | 11.13 | 24.52 (8)
3 University of Minnesota | 10.64 | 40.11 (3)
4 University of Illinois at Urbana-Champaign | 9.88 | 33.36 (5)
5 University of Maryland | 8.78 | 32.12 (6)
6 Pennsylvania State University | 6.75 | 45.79 (2)
7 University of Akron | 6.38 | 25.76 (7)
8 University of South Florida | 6.28 | 33.50 (4)
9 University of Georgia | 5.43 | 16.27 (12)
10 George Mason University | 4.07 | 22.24 (9)
11 Florida International University | 3.91 | 9.83 (18)
12 Colorado State University | 3.78 | 15.49 (14)
13 Tulane University | 3.69 | 12.01 (16)
14 University of Michigan | 2.93 | 10.58 (17)
15 Georgia Institute of Technology | 2.52 | 19.16 (10)
16 University of Connecticut | 2.40 | 9.41 (19)
17 New York University | 1.36 | 12.45 (15)
18 Texas A&M University | 1.27 | 17.21 (11)
19 Purdue University | 1.26 | 15.65 (13)
20 University at Albany-SUNY | 0.00 | 2.75 (20)

Note. Parentheses indicate the institutional rank ordering of career output in the top ten journals. 

Table 3 presents rankings based on total publications in the past 5 years and for total career. It is clear from Table 3 that certain programs have been productive both in the top 10 I-O journals and across all indexed research outlets, historically and over the past 5 years. Specifically, Michigan State University and the University of Illinois at Urbana-Champaign remained in the top 5 for all five rankings computed in the present study.

 

Table 3. Overall Research Productivity of Faculty in I-O Psychology Doctoral Programs Based on Total Publications, 1996–2000 and Career

Institution | Total research output: 1996–2000 | Total research output: career
1 University of Illinois at Urbana-Champaign | 58.55 | 248.37 (1)
2 Michigan State University | 41.63 | 135.50 (2)
3 Bowling Green State University | 28.34 | 57.39 (14)
4 University of South Florida | 25.73 | 120.15 (3)
5 University of Akron | 24.24 | 82.00 (7)
6 University of Michigan | 23.61 | 65.23 (9)
7 University of Minnesota | 23.29 | 82.34 (6)
8 Georgia Institute of Technology | 22.58 | 83.74 (5)
9 Colorado State University | 21.92 | 64.90 (10)
10 University of Maryland | 21.42 | 78.31 (8)
11 University of Georgia | 20.02 | 64.05 (11)
12 Pennsylvania State University | 18.41 | 112.89 (4)
13 New York University | 13.17 | 61.96 (12)
14 George Mason University | 11.83 | 61.70 (13)
15 Florida International University | 11.20 | 25.05 (18)
16 University of Connecticut | 9.71 | 51.07 (15)
17 Tulane University | 5.35 | 19.84 (19)
18 Purdue University | 3.43 | 30.00 (17)
19 Texas A&M University | 2.30 | 36.10 (16)
20 University at Albany-SUNY | 1.60 | 9.18 (20)

Note. Parentheses indicate the institutional rank ordering of total research output for career. 

Last, we calculated each program's average ranking based on the four indices we used (shown in Table 4). The average rank provides an index of how productive programs were across all four of the research productivity indices shown in Tables 2 and 3. The average per capita ranks provide a measure of the productivity of the graduate program relative to the number of current faculty in the program, allowing for a more even comparison of productivity that is not confounded by department size. It is not surprising, considering the above discussion, that Michigan State University and the University of Illinois at Urbana-Champaign tied for the top spot in the per capita rank.

 

Table 4. Average and Per Capita Ranks of Productivity

Institution | Average rank | Rank | Average per capita rank | Per capita rank
1 Michigan State University | 1.5 | 1 | 2.0 | 1
2 University of Illinois | 2.8 | 2 | 10.3 | 9
3 University of Minnesota | 4.8 | 3 | 2.0 | 1
4 University of South Florida | 4.8 | 3 | 6.8 | 4
5 Pennsylvania State University | 6.0 | 5 | 7.5 | 7
6 University of Akron | 6.5 | 6 | 12.3 | 13
7 Bowling Green State University | 6.8 | 7 | 6.8 | 4
8 University of Maryland | 7.3 | 8 | 6.8 | 4
9 Georgia Institute of Technology | 9.5 | 9 | 9.0 | 8
10 University of Georgia | 10.8 | 10 | 12.0 | 12
11 Colorado State University | 11.3 | 11 | 11.5 | 11
12 University of Michigan | 11.5 | 12 | 17.8 | 19
13 George Mason University | 11.5 | 12 | 14.5 | 16
14 New York University | 14.3 | 14 | 4.3 | 3
15 Florida International University | 15.5 | 15 | 10.3 | 9
16 Texas A&M University | 16.0 | 16 | 14.8 | 17
17 Tulane University | 16.3 | 17 | 12.3 | 13
18 University of Connecticut | 16.5 | 18 | 17.0 | 18
19 Purdue University | 16.8 | 19 | 12.5 | 15
20 University at Albany-SUNY | 20.0 | 20 | 20.0 | 20

 

Discussion

We believe this set of rankings provides a current and broader index of graduate program quality in I-O psychology. Considerable effort went into designing rankings that yield different kinds of information. To begin, the top five journal rankings encompassed student involvement in research, as they were based not on faculty but on departmental affiliation. In addition, the top 10 journal rankings provided an index of faculty research productivity across a broader range of top I-O research journals. The overall productivity rankings provided an index of the total research production of faculty members in graduate I-O programs. Lastly, the average rank and per capita ranks provided a summary ranking of productivity in graduate departments across all four cuts of the data (two top 10 journal time frames and two overall productivity time frames) made in the present study.

Differences between the various rankings can be explained with further analysis. For example, Michigan State University tops most rankings except overall research productivity for all publications, which is topped by the University of Illinois at Urbana-Champaign. Many of the Illinois publications, though, were produced by faculty affiliated with the I-O program (which is a social-organizational program) but appeared in non-I-O-related journals. Therefore, the discrepancy in rankings is informative about the nature of the various programs.

It is hoped that these rankings provide a useful alternative to the rankings recently published by U.S. News & World Report (2001). However, a note of caution is warranted. Specifically, the present rankings are limited because they capture student involvement only via the top five journal rankings. In addition, the rankings neglect other sources of information related to the quality of graduate education. Clearly, the criterion space is much larger and likely multidimensional: The number, variety, and quality of courses offered, faculty-student relationships, student funding, and research and travel support are all important considerations in judging overall program quality. Although we feel program research productivity is an important factor in program quality, we do not assume that we are measuring all important aspects bearing on overall program quality, and our results should not be interpreted as such. Finally, as with any set of rankings, consumers should weigh the criteria most important to them, as all rankings have limitations (Winter et al., 1995).

References 

America's Best Graduate Schools. (1995, March 20). U.S. News and World Report.
America's Best Graduate Schools. (2001, April 9). U.S. News and World Report.
Cox, W. M., & Catt, V. (1977). Productivity ratings of graduate programs in psychology based upon publication in the journals of the American Psychological Association. American Psychologist, 32, 793–813.
Howard, G. S., Cole, D. A., & Maxwell, S. E. (1987). Research productivity in psychology based on publication in the journals of the American Psychological Association. American Psychologist, 42, 975–986.
Howard, G. S., Maxwell, S. E., Berra, S. M., & Sternitzke, M. E. (1985). Institutional research productivity in industrial/organizational psychology. Journal of Applied Psychology, 70, 233–236.
Jones, R. G., & Klimoski, R. J. (1991). Excellence of academic institutions as reflected by backgrounds of editorial board members. The Industrial-Organizational Psychologist, 28(3), 57–63.
Levine, E. L. (1990). Institutional and individual research productivity in I-O psychology during the 1980s. The Industrial-Organizational Psychologist, 27(3), 27–29.
Nederhof, A. J. (1989). Books and chapters are not to be neglected in measuring research productivity. American Psychologist, 44, 734–735.
Payne, S. C., Succa, C. A., Maxey, T. D., & Bolton, K. R. (2001). Institutional representation in the SIOP conference program: 1986–2000. The Industrial-Organizational Psychologist, 39(1), 53–60.
Surette, M. A. (1989). Ranking I-O graduate programs on the basis of student research presentations. The Industrial-Organizational Psychologist, 26(3), 41–44.
Winter, J. L., Healy, M. C., & Svyantek, D. J. (1995). North America's top I-O psychology doctoral programs: U.S. News and World Report revisited. The Industrial-Organizational Psychologist, 33(1), 54–58.
Zickar, M. J., & Highhouse, S. (2001). Measuring prestige of journals in industrial-organizational psychology. The Industrial-Organizational Psychologist, 38(4), 29–36.

 
