Volume 54  Number 4  April 2017  Editor: Tara Behrend

President’s TIP Column

Once again time has moved rapidly by, and I find myself preparing the last of my presidential TIP columns.  A lot has happened over the last few months, and I can’t begin to discuss all of it, but here are a few of the exciting things going on with SIOP.

 

Most of you know by now that our incoming executive director is on the job.  Jeff Hughes began full-time on February 1 and is already busy helping out while learning as much about SIOP as he can.  We were pleased to have him attend the Executive Board meeting in January, where he had a chance to interact with the Conference planning groups and observe your Board in action—or at least in enthusiastic debate!

 

Preparations for the conference are coming along wonderfully, and we’re setting records for pre-registrations.  Executive Director Dave Nershi and the AO staff are staying ahead of the curve and making arrangements for additional space to be sure we have accommodations for everyone.  By now you have probably heard the news that our Conference Committee (chaired by Daisy Chang) was successful in recruiting Dr. Stanley Love, a NASA astronaut, to be the keynote speaker at our Closing Plenary session.  Dr. Love’s responsibilities include planning for human exploration of the moon, asteroids, and Mars.  Thanks also to Steve Kozlowski for helping bring Dr. Love to our meeting.  The Program Committee (chaired by Zach Horn) has been working diligently to put together a stellar program while inaugurating new software designed to “make the job easier” (which it will in the long run).  I want to offer my personal thanks to Tracy Kantrowitz (Program Committee CiT) for her and her team’s work on the Theme Track, which highlights my emphasis on I-O psychology’s future.  All in all, this conference is shaping up to be truly excellent.

 

There have been a couple of important things happening at APA over the last several weeks.  As you may have seen in Newsbriefs, APA has hired a new CEO, Arthur C. Evans, Jr., PhD, after a careful and thorough search by a special task force assembled for the purpose that included our own Rodney Lowman.  Dr. Evans is a clinical psychologist with extensive experience as an administrator, having served as commissioner of Philadelphia’s Department of Behavioral Health and Intellectual disAbility Services and previously as deputy commissioner of the Connecticut Department of Mental Health & Addiction Services.  He is also widely published and served as an associate clinical professor in the Department of Psychiatry at the Yale University School of Medicine.

 

Although a new commission for reviewing and revising the current APA Ethical Standards and Code of Conduct has not yet been named, its formation is expected soon.  SIOP has been in communication with the chair of the APA Ethics Committee and current APA President Dr. Antonio Puente to urge them to include an I-O psychologist on the commission when it is formed.  We will be submitting nominations for this role when a call is issued.

 

The Executive Board held its Winter meeting at the Conference hotel in Orlando in January.  As you might guess, much of our time was spent reviewing the work being done to ensure the success of our conference in April; however, we addressed a plethora of other issues as well.  One of those, I think, is of particular importance.  As president I appointed, and the Board voted to fund a meeting of, a task force to investigate and develop strategies for improving and ensuring robust and reliable research.  Steve Rogelberg will chair the task force, which has already held a retreat that included both face-to-face discussion and a broad array of teleconference and videoconference input from SIOP members who have been concerned about and at the forefront of these issues.  As stated in the charge to the task force:

 

When research is reproducible, replicable, and generalizable (cf. Cook, 2016, NSF 16-137, https://www.nsf.gov/pubs/2016/nsf16137/nsf16137.jsp), it provides credible and critical knowledge and supports subsequent research and application.  However, the credibility of psychological research is in question, as evidenced most recently by numerous problems identified in the field of social psychology—problems from which I-O psychology is not immune. The problems are tied, in good part, to questionable research practices (QRPs) that undermine the quest for robust and reliable research.

 

These issues were the focus of the presidential theme presented by Jose Cortina in 2015.  The impetus for this Task Force came from a growing concern about these issues and a recent editorial in the Journal of Business and Psychology reporting a review of the literature related to QRPs (Banks, Rogelberg, Woznyj, Landis, & Rupp, 2016).

 

The more that I have examined the issues that will be considered by this group, the more I have come to view the issues of questionable research practices, reproducible research, and replication of scientific results as being of paramount concern to I-O psychology as both a science and a profession.  I think the risks associated with these issues fall into at least four domains. 

First, the issues raise questions of epistemology, that is, how we determine what we can reasonably take to be accurate factual representations of the actual state of the external world.  If we cannot rely on the accuracy and veracity of research that is undertaken to evaluate our theories, we then lack the robust science we need to test them and, thus, devolve to intuition and guesses based on individual observations.  Karl Popper’s (1935) criterion of demarcation between true science and pseudoscience (and, similarly, those endeavors that are outside the realm of science altogether) was that true science generates statements (hypotheses, if you will) that can be subjected to falsification by sound and replicable research.  He noted quite specifically that a single study, even if well conceptualized and conducted, seldom suffices to falsify strong statements from well-developed theory.  Both the inability to replicate findings and the failure to attempt to do so deprive us of strong tests of our theories, which in turn undermines our ability to revise, refine, or change those theories to better understand phenomena and provide a rational basis for knowledge.

Second, these issues raise questions for the science of I-O psychology.  When questions of unsound research practices arise, concomitant questions about the value of the research and the justification for funding it, whether public or private, also arise.  In addition, such research undermines data aggregation efforts such as meta-analyses.  Even more insidious may be the extent to which such research sends researchers down unfruitful paths, wasting valuable intellectual capital while diverting limited research funds from more meaningful investigations.

Third, the work that is done in science is often inscrutable to persons untrained in our theory and scientific and analytical methods.  Thus, the acceptance of “evidence-based” practices very frequently rests on the confidence that non-scientists place in the work that is done in our field.  Psychology has historically faced particular concerns about the veracity of claims made or recommendations offered, in part because the phenomena we study are complex and frequently not directly observable.  This problem, although long true for psychological research, is currently expanding to include other sciences as well (for example, climatology), with similar concerns about veracity.  Loss of confidence in the science presents practitioners with substantial concerns.  The organizations in which they work or to which they provide counsel expect the research reported in the literature, as well as the research conducted on their behalf, to be both reliable and robust.  Moreover, when I-O psychologists testify in matters of litigation, they are relying on the research reported in the literature or conducted in their organizations, and courts are relying on that testimony to make substantive decisions under the law.

Fourth, when SIOP undertakes advocacy for the science of I-O psychology, we are seeking acceptance by policy and decision makers that the science that I-O researchers can provide offers a serious and reliable basis on which to predicate important policy decisions.  To the extent that poor research practices undermine confidence in the science, the impact that I-O psychology can make will be diminished.  Even more concerning, if such practices produce flawed knowledge and conclusions that then become the basis for substantive government policy and action, unintended and deleterious outcomes may arise.

I think the work of this Task Force has great significance for SIOP, I-O psychology, and science in general.  I do not expect that this one effort will resolve all of the issues or “turn the tide” in favor of robust and reliable research; however, it is my hope that it may lay the groundwork for the long and difficult work ahead to ensure that our science and profession remain vital and valuable in providing solutions to make work and workplaces productive and positive.

In addition to this group, I have appointed another task force to attend to a matter of importance to our advocacy efforts.  As part of those efforts we have sought improved methods for demonstrating the value and importance of I-O science to issues of national concern.  In consultation with our advocacy partners at Lewis-Burke, we have decided to investigate the feasibility of developing an evidence-based metric or metrics of workforce effectiveness to underscore our contributions.  The task force will be chaired by Rob Ployhart and (from its charter) is charged:

with investigating the feasibility of SIOP developing an index, measure(s), or indicator(s) that will assist public and organizational decision makers in accounting for the value of effectively deployed and managed human capital in making informed policy judgments.  The indicator(s) will be predicated on the best available scientific evidence.  Included within the scope of such an indicator(s) could be workplace health and safety issues, effective team functioning, valid selection and performance management systems, high-quality training and retraining programs, employee engagement and satisfaction, capable leadership, workforce diversity, organizational justice, employee retention/turnover, appropriate compensation practices, and/or related human resource characteristics.  If determined to be feasible, the purpose of the indicator(s) would be to highlight and take into consideration the implications of public policies and decision making for human beings in the workplace and the resulting contributions to or detractions from macro-economic success.

In February I had the opportunity to attend the 2017 IOOB Conference, which was held at Rice University.  The conference coordinators (Christy Nittrouer, Rachel Trump-Steele, Abby Corrington, Christina Lacerenza, Denise Reyes, Amanda Woods, and Jensine Paoletti) did an excellent job (prime candidates for serving on our Conference and Program Committees in the future).  The program featured examples of research being conducted by graduate students in multiple arenas as well as several invited addresses and roundtable meetings with students by leaders in our field such as Mikki Hebl, Murray Barrick, Fred Oswald, Jose Cortina, Juan Madera, Kelly Slack, Annette Spychalski, Jay Goodwin, Sophie Romay, and Ed Salas.  It is clear that I-O psychology has a strong and vibrant pipeline of new talent entering our field.

 

Finally, I want to offer congratulations to Vicki Vandaveer, Rodney Lowman, Ken Pearlman, and Joan Brannick on receiving this year’s award for the most outstanding article published in Consulting Psychology Journal in 2016.  Their article, “A Practice Analysis of Coaching Psychology: Toward a Foundational Competency Model,” is an empirically based study of the professional practices of psychologists providing developmental coaching services.  This work began as a collaboration between the Society of Consulting Psychology (APA Division 13) and SIOP.

 

When I started this journey last spring, I had no way of knowing what a challenging and ultimately fulfilling task I was undertaking.  I have gotten to work with many colleagues, some I’ve known for years and some I’m only just meeting and getting to know.  I’ve had the privilege to be part of SIOP’s continuing growth and impact.  I look forward to seeing all of you in Orlando as we continue to look to the future of I-O psychology and the role that SIOP will play in it.

 

References

Banks, G. C., Rogelberg, S. G., Woznyj, H. M., Landis, R. S., & Rupp, D. E. (2016). Editorial: Evidence on questionable research practices: The good, the bad, and the ugly. Journal of Business and Psychology. Published online June 25, 2016.

Cook, F. L. (2016). Dear Colleague Letter: Robust and reliable research in the social, behavioral, and economic sciences (NSF 16-137). National Science Foundation. https://www.nsf.gov/pubs/2016/nsf16137/nsf16137.jsp

Popper, K. (1935/1959/2002). The logic of scientific discovery. New York, NY: Taylor & Francis.

Vandaveer, V. V., Lowman, R. L., Pearlman, K., & Brannick, J. P. (2016). A practice analysis of coaching psychology: Toward a foundational competency model. Consulting Psychology Journal: Practice and Research, 68(2), 118-142.
