Caveat Emptor: The Gourman Report

Arthur G. Bedeian
Louisiana State University

Several years ago, two colleagues and I began searching for a ranking of U.S. management departments for a career study we had been discussing. Among the various rankings available, The Gourman Report seemed especially appropriate, having been used in articles appearing in many distinguished journals, including the American Economic Review, the Journal of Human Resources, and Social Forces. As a set of respected colleagues noted, The Gourman Report "offers the only guide to higher education quality that assigns numerical scores measuring university quality, and has consequently been used by a number of researchers" (Judge, Cable, Boudreau, & Bretz, 1995, p. 498). When I went to retrieve The Gourman Report from the LSU library, the reference librarian on duty informed me that the volume was provided to patrons upon request, but only with a standard warning, reflected in the concluding statement of a review of the Report that appeared in the Wilson Library Bulletin: "Only an irresponsible reference librarian would add [the Gourman Report] to a collection" (Rettig, 1993, p. 124).

Principal among the concerns about The Gourman Report is its author's (Jack Gourman's) refusal to reveal his methodology. As noted in The Chronicle of Higher Education, most of the scores in the most recent Gourman Report (1997), covering 105 disciplines and 1,273 undergraduate institutions, "differ by only 1/100th of a point, with no wider gaps and no ties," an outcome that researchers call "a near-impossibility, statistically" (Selingo, 1997, p. A45). The Chronicle goes on to note, "What's more, college officials have no idea how the rankings were determined, because no one ever contacted the institutions for the information" (Selingo, 1997, p. A45). In Gourman's defense, the Chronicle acknowledges that Gourman says he derives his rankings by averaging scores for 10 factors related to program quality (Selingo, 1997, p. A46). At the same time, however, the Chronicle states that Gourman refuses to elaborate on his criteria or on how those factors can be quantified on a numerical scale (Selingo, 1997, p. A46). It would appear that the factors are weighted in some unspecified fashion: according to Chronicle calculations, for example, Princeton University has an overall score of 4.95, but the mean of its 43 ranked programs is 4.72. In Gourman's further defense, the Chronicle reports that, rather than collect information from an institution's officials, Gourman says he relies on letters received from faculty members and others who write him about their programs, as well as on 50 trained people working around the country. Gourman adds that the letters he receives are destroyed to protect the identity of his sources. Just how these sources would have access to the information necessary to make the judgments Gourman offers is thus unclear. He has further stated that explaining his research methods more fully would be confusing and that, because his methodology would run hundreds of pages, it would be too expensive to print with his reports (Webster, 1984, p. 16; see also Webster, 1986). One must nonetheless wonder how Gourman's methodology allowed him to rank nonexistent departments at, for example, the University of Chicago and Claremont McKenna College. When Evan Schnittman, The Gourman Report's editor, was asked by Chronicle reporter Jeffrey Selingo (1997) to elaborate on Mr. Gourman's criteria for ranking programs, Selingo was told that no explanation was needed and that only reporters, not consumers, are concerned with methodology.
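To see why a pattern of uniform 1/100th-point gaps strains credulity, a back-of-the-envelope simulation suffices. The Python sketch below is purely illustrative and rests on assumptions not found in the Report itself: scores for a hypothetical field are drawn independently from an assumed uniform distribution over an assumed 2.00 to 5.00 range and rounded to hundredths, as the published scores are. Under any such independent-scores model, ties are all but guaranteed and a perfect staircase of 0.01-point gaps essentially never occurs.

import random

# Illustrative sketch only -- not Gourman's data or method. We assume n
# programs receive independent quality scores drawn uniformly from an
# assumed 2.00-5.00 range, rounded to hundredths like the published scores,
# and ask how often the Report's pattern (no ties, every consecutive gap
# exactly 0.01) arises by chance.

def published_pattern(n=100, lo=2.0, hi=5.0):
    scores = sorted(round(random.uniform(lo, hi), 2) for _ in range(n))
    gaps = [round(b - a, 2) for a, b in zip(scores, scores[1:])]
    # Every gap equal to 0.01 implies no ties and no wider gaps.
    return all(g == 0.01 for g in gaps)

trials = 100_000
hits = sum(published_pattern() for _ in range(trials))
print(f"{hits} of {trials:,} random trials reproduced the published pattern")
# Prints 0 of 100,000: under these assumptions the pattern is, as the
# researchers quoted in the Chronicle put it, "a near-impossibility,
# statistically."

With 100 programs and only about 300 possible two-decimal values in the assumed range, ties alone are overwhelmingly likely in every trial, quite apart from the further requirement that the sorted scores form a perfect arithmetic sequence.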

The purpose of this note is quite simple: to advise caution in taking program rankings at face value. Offering such a caveat to SIOP members, with their methodological sophistication, may seem odd. After all, without information about a study's sources and methodology, its claims would never be accepted by the SIOP membership. Yet an Internet search reveals that, with recent articles appearing in Personnel Psychology, the Academy of Management Journal, and Organization Science, some of our colleagues may have unwittingly relied upon The Gourman Report for source data. Moreover, the same search reveals that many of our institutions unhesitatingly publicize their rankings in The Gourman Report on their World Wide Web sites.

Program rankings are available from a variety of sources, ranging from what some regard as the arbitrary opinions of individuals to teams of scholars working on behalf of private, nonprofit organizations chartered to advise the federal government. As my colleagues and I learned in our search, knowledge of how such rankings are constructed is essential to ensure that they are used appropriately.

References

   Gourman, J. (1997). The Gourman Report (8th ed.). New York: Random House.
   Judge, T. A., Cable, D. M., Boudreau, J. W., & Bretz, R. D., Jr. (1995). An empirical investigation of the predictors of executive career success. Personnel Psychology, 48, 485-519.
   Rettig, J. (1993). Current reference books. Wilson Library Bulletin, 67(10), 119-124.
   Selingo, J. (1997). A self-publishing college guide goes big-time, and educators cry foul. Chronicle of Higher Education, 44(11), A45-A46.
   Webster, D. S. (1984). Who is Jack Gourman and why is he saying all those things about my college? Change, 16, 14-19, 45-56.
   Webster, D. S. (1986). Jack Gourman's rankings of colleges and universities: Guide for the perplexed. RQ, 25, 323-331.
