Practitioners' Forum: The Intersection of Technology and Science: Perspectives on Drivers of Innovation in I-O Practice
Tracy Kantrowitz and Craig Dawson
with Practitioner Committee updates by Rich Cober
Science and technology often work hand in hand to drive innovations in the practice of I-O psychology. Scientific methodology associated with foundational areas of psychology applied to the workplace allows us to be experts in the midst of HR trends and technological advances. But what happens when the lure of technology trumps the science? In what areas does I-O psychology need to shore up research, theory, and best practices to keep pace with the adoption of HR-related technological advances in the workplace? Conversely, what are some areas in I-O psychology viewed as “untouchable”? That is, are there areas relatively more immune to technology outpacing science? In this article we highlight examples of how technology set the stage for the creation of best practices, areas of I-O practice that have been preserved and even enhanced by technology, and areas of I-O practice in which science and technology have served complementary roles in innovations in organizations. We also provide ideas for how I-O practitioners can work more effectively with IT staff to meet common goals and ideas for challenging our long-standing assumptions of I-O practice to identify new/alternate methods for conducting research in organizations.
The Train Has Left the Station: Adoption of HR Technology Without Scientific Best Practices
The rise of technological advances can challenge our thinking and lead to psychological research and theory designed to keep pace with these advances. But in some instances, the adoption of new technology has outpaced scientific best practices. Organizations may desire to adopt new technology in the absence of best practices if the benefits are great. The rise of unproctored Internet testing (UIT) is a classic example of this phenomenon. Starting in the 1990s, organizations were attracted to the notion of testing candidates remotely to minimize costs associated with on-site proctoring, broaden applicant pools, and reduce administrative burdens (Tippins, 2009). Organizations plowed forward and left psychologists in the dust debating the appropriateness of UIT, and only recently have empirical investigations of the implications of UIT been brought forward (e.g., Arthur, Glaze, Villado, & Taylor, 2010; Beaty et al., 2011).
We anticipate this trend continuing in a variety of other areas. In addition to assessment and selection programs, technological systems are now the central components of training, performance management, applicant tracking, and company reward programs. Social media, with its new mechanisms of generating and tracking personal data, has impacted organizational recruitment, communication, and brand identity. Even greater impacts might be realized as social media becomes embedded within most aspects of our lives.
Technology has even changed the physical way in which work is organized, with remote working capabilities becoming increasingly common. Whether or not each of these changes is “good” for employees or organizations could be a deep research area, but similar to UIT, scientific questions are secondary to addressing perceived inefficiencies. Organizations are not stopping to introspect on how this might change our assumptions about work and our science. Indeed, the now cliché notion that “the world of work is changing” has never been more true, as we’re increasingly finding ourselves in a place where communication, culture, and core rules about how processes are organized have drastically changed. How are scientists, with our ever-present desire to collect data, control extraneous variables, and test tightly defined hypotheses, ever to find our voice?
The answer may be found in expanding beyond our comfort zone and applying core principles. Practitioners apply a method of observation, experimentation, and analysis to solve organizational challenges. But more fundamentally, we’re also effective at study design. This may be our inroad with technological HR providers. Many technological applications address HR areas concerned with data management, which can be thought of as analogous to study design. Specifically, organizations are struggling to deal with large numbers of people in a standardized way to address some problem or issue. Information technology is very effective at data management. Mutual interest in data is where we both have a means to contribute. Mathematics, logic, and ultimately, how we make use of data are threads where I-O psychologists can help guide IT systems. If we show an expertise and an interest, we can become SMEs and stakeholders for IT initiatives, and by assisting in design, we can help set IT systems up for success. Whether it’s the theory behind a training or motivation program, additional variables to categorize in a performance appraisal system, or identifying the types of data we’ll need for a validation program, we have multiple means to improve on data management systems. Data management and IT systems can be remarkably flexible in their design, so it’s up to us as I-Os to participate in such endeavors.
Don’t be afraid to learn and branch out. We recall in graduate school the divide between the students interested in statistics and those who avoided such topics. That choice impacts career options and the ability to contribute to certain areas of our field. We see a second divide shaping up between I-Os who want to immerse themselves in technology and its capabilities for our science and those who prefer to make use of what is available. The former have the potential to be influencers as these technology-based answers are increasingly sought by organizations. Whether you work for a consulting firm or internally within an organization, better science can result from bringing I-O and IT closer together. Based on our experiences and conversations with colleagues and peers, direct IT knowledge may be less important than soft skills for enhancing this synergy:
1. Show an interest in technology
2. Demonstrate expertise in common areas
3. Serve as an SME for products or problem solutions
IT practitioners understand the technology. They can benefit from our knowledge of how to use data and how to measure the effectiveness of solutions. As I-O practitioners, we have the opportunity to monitor trends in technology, anticipate potential applications/implications for I-O practices, conduct research to craft best practices, and work with partners in IT to design solutions based on scientific methods that meet the needs of organizations. This close collaboration with IT can increase our relevance to organizations, drive meaningful innovations, and ensure that technology is implemented with scientific practices in mind.
Science Sets the Stage for Practice
Science serves as the foundation for the practice of I-O psychology and has been the cornerstone of a number of core tasks and responsibilities of I-O practitioners even in the midst of new trends and technologies. I-O practitioners are experts when it comes to the study of jobs and designing and evaluating HR processes and systems. Our expertise in these areas is valued, but we must continue to maintain our relevance in light of technology-based systems that give the illusion of systematic and standardized processes. Nonetheless, the methodology and relevance of job analysis, selection system design, and validation research are entrenched in scientific methods, and organizations continue to look to I-O psychologists as “keepers” of this information. In these examples, technology has aided the science to increase efficiency in conducting job analysis and validation research. Web-based tools facilitate administrative aspects of data collection, including survey administration, tracking, aggregation, and even basic data analysis, and have allowed the I-O expert to focus on judgment-based and statistical analysis associated with this work. More broadly, the objectivity associated with statistical analysis used to demonstrate the value of a wide range of I-O processes has been a practitioner’s biggest commodity. The specialized training we receive and the application of statistical methods to HR processes will remain an asset and maintain our relevance to organizations.
How can I-Os continue to be viewed as the “keepers” of these processes so that organizations continue to rely on us for this kind of expertise? The answer in part depends on (a) how much we challenge ourselves to consider new and alternative methods for conducting organizational research and practicing I-O psychology and (b) the extent to which we can leverage technology to practice I-O psychology. For the former, consider for example the established methods of conducting job analyses and validation studies that are predicated on the availability of large samples. This may work for larger organizations, but what are smaller organizations left to do? If I-O practitioners uphold strict criteria for conducting research in organizations, it may mean our practice does not meet the broad needs of organizations. Creating/utilizing synthetic validity databases and using consortium research designs are examples of alternatives practitioners should consider (see Alternative Validation Strategies, McPhail, 2007, for other examples). Relatedly, we also have to work within the realities of organizations to understand their drivers and constraints to determine how to implement I-O practices that meet their needs. Within the same example, not all selection systems can handle multiple-hurdle, resource-intensive designs, and it’s up to us to determine how to identify alternatives to classic approaches. Doing so will continue to highlight our willingness to partner with our clients (internal or external) to tailor programs that meet the changing needs of organizations.
Leveraging technology to drive process efficiencies will also increase our relevance to organizations. Automating processes such as performance management, training evaluation, and satisfaction surveys via Web-based administration not only reduces administrative burdens but also offers the potential to make data available at our fingertips. Designing selection systems that use the power of applicant tracking systems centralizes complex processes and can allow for more objective decision making if processes are well defined for hiring managers. Although the end result may be a technology-enabled, elegant system, scientific information still exists behind the scenes.
Technology and Science Working Together to Do What They Do Best
As noted, technology can facilitate scientific work and challenge our thinking about how to bring the best of the two worlds together. An example of this is in the area of technology-enabled assessment (see Technology-Enhanced Assessment of Talent, Tippins & Adler, 2011). Computerized adaptive testing (CAT) satisfies the increased demand for unproctored Internet testing while preserving test security. It utilizes sophisticated item response theory (IRT) algorithms and computer technology to deliver a tailored testing experience to test takers based on their ability/trait level. CAT engines have been designed and are in use in a variety of certification, education, and employment applications and demonstrate improved measurement precision (compared to classical test theory alternatives) while bringing a variety of other benefits to organizations. Another example is multimedia-based assessments that enhance an organization’s employment brand as a leading-edge company and create more positive applicant perceptions. Video, animation, and virtual reality environments help deliver the assessment content in a more engaging and higher fidelity way compared to traditional text-on-screen computerized assessments.
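To make the CAT logic concrete, the sketch below shows the core loop most CAT engines share: estimate the test taker’s ability from responses so far, then administer the unanswered item that is most informative at that estimate. This is a minimal illustration assuming a two-parameter logistic (2PL) IRT model; the item bank, parameter values, and function names are hypothetical and do not describe any particular commercial engine.

```python
import math

# Hypothetical item bank: (discrimination a, difficulty b) under a 2PL model
ITEM_BANK = [(1.2, -1.5), (0.8, -0.5), (1.5, 0.0), (1.0, 0.7), (1.3, 1.4)]

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * p * (1 - p)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, administered):
    """Maximum-information selection: pick the unadministered item that is
    most informative at the current ability estimate."""
    candidates = [i for i in range(len(ITEM_BANK)) if i not in administered]
    return max(candidates, key=lambda i: information(theta, *ITEM_BANK[i]))

def eap_theta(responses):
    """Expected a posteriori (EAP) ability estimate over a coarse grid,
    using a standard-normal prior. `responses` is a list of
    (item_index, answered_correctly) pairs."""
    grid = [g / 10.0 for g in range(-40, 41)]  # theta values from -4.0 to 4.0
    posterior = []
    for theta in grid:
        weight = math.exp(-theta * theta / 2.0)  # unnormalized N(0, 1) prior
        for item, correct in responses:
            p = p_correct(theta, *ITEM_BANK[item])
            weight *= p if correct else (1.0 - p)
        posterior.append(weight)
    total = sum(posterior)
    return sum(t * w for t, w in zip(grid, posterior)) / total
```

Starting from an average ability estimate (theta = 0), the selector picks the item with difficulty nearest zero and high discrimination; a correct answer shifts the estimate upward and a harder item is served next, which is how a CAT tailors itself to each test taker with far fewer items than a fixed-form test.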
In conclusion, these observations and experiences are designed to highlight gaps and opportunities for I-O practitioners to shore up research to support the use of technology in organizations, skills that facilitate our increased relevance to organizations, and ideas for leveraging technology to increase efficiency of I-O-based practices. The intersection of I-O and IT will remain on the forefront of I-O practice as evidenced by sessions on this topic at the SIOP conference. We look forward to continuing the discussion.
Practitioner Committee Updates and Notes
A top priority was to establish access to a research database for SIOP members. Our June pilot was a great success, and we continue to receive inquiries about when full-time access to research databases may become a reality for SIOP membership. I can report that the SIOP Research Access Service is now available, so please sign up to use it.
Outreach to practitioner needs in other areas is taking the form of the continued evolution of SIOP’s membership programs. At this year’s LEC conference, our mentoring team led by Sam Ritchie and Mark Poteet debuted virtual mentoring. This was particularly appropriate for this year’s LEC given the theme around the virtual workplace. At this year’s SIOP in San Diego, there will be the annual speed-mentoring session, and the team is working to establish a group-mentoring forum to round out our mentoring strategy. Carl Persing and Cathie Murensky are leading other work on the committee to create webinars for SIOP practitioner and research audiences. We are hoping to see webinars begin to appear early in 2012. Finally, our SHRM–SIOP partnership continues to move forward. We are currently working to identify authors interested in contributing to this series who can write interesting, HR practitioner-directed articles in the areas of leadership development, performance management, organizational culture and change, and/or general behavior management (as it relates to health and wellness campaigns, for example). If you are reading this and at all interested in writing for this series, please reach out to me directly by e-mail or consider attending the SIOP session we will devote to this series in the spring.
Arthur, W. Jr., Glaze, R. M., Villado, A. J., & Taylor, J. E. (2010). The magnitude and extent of cheating and response distortion effects on unproctored Internet-based tests of cognitive ability and personality. International Journal of Selection and Assessment, 18, 1–16.
Beaty, J. C., Nye, C., Borneman, M., Kantrowitz, T. M., Drasgow, F., & Grauer, E. (2011). Proctored versus unproctored Internet tests: Are unproctored tests as predictive of job performance? International Journal of Selection and Assessment, 19, 1–10.
McPhail, S. M. (Ed.). (2007). Alternative validation strategies: Developing new and leveraging existing validity evidence. San Francisco, CA: Jossey-Bass.
Tippins, N. (2009). Internet alternatives to traditional proctored testing: Where are we now? Industrial and Organizational Psychology: Perspectives on Science and Practice, 2, 2–10.
Tippins, N., & Adler, S. (2011). Technology-enhanced assessment of talent. San Francisco, CA: Jossey-Bass.