Volume 55     Number 4    Spring 2018      Editor: Tara Behrend


The Bridge: Connecting Science and Practice: Prehire Screening: A Case Study at CVS Health

Margaret Collins and Meredith Vey, CVS Health

Column Editors: Mark L. Poteet, Organizational Research & Solutions, Inc.; J. Craig Wallace, University of Denver & AOE Science; and Lynda Zugec, The Workforce Consultants

  “The Bridge: Connecting Science and Practice” is a TIP column that seeks to facilitate learning and knowledge transfer in order to encourage sound, evidence-based practice. It gives academics an opportunity to discuss the potential and/or realized practical implications of their research, as well as to learn about cutting-edge practice issues or questions that could inform new research programs or studies. For practitioners, it provides opportunities to learn about the latest research findings that could prompt new techniques, solutions, or services benefiting the external client community. It also gives practitioners an opportunity to highlight key practice issues, challenges, and trends that may benefit from additional research. In this issue, we profile CVS Health, winner of the 2017 HRM Impact Award for its implementation of an evidence-based, prehire screening assessment for call center job applicants.

Overview of CVS Health

We at CVS Health were honored to receive the 2017 HRM Impact Award. We truly believe in the scientist–practitioner model for which we were recognized and hope that others consider this prehire screening assessment a model from which to learn and work. In this column, we provide a case study of how a team of I-O practitioners applied science and common business sense to build and implement a prehire screening assessment and to showcase its value to the highest levels of the organization.

CVS Health is the nation’s leading pharmacy innovation company, “helping people on their path to better health.” These eight simple yet powerful words are more than a mission; they are our purpose, and they guide the company’s 243,000 colleagues as they go about their daily business.

As a Fortune 10 company with a unique integrated business model, CVS Health is not your traditional drug store. We have built a combination of capabilities to work with patients, payers, and providers to offer access to lower-cost, high-quality sites of care and to help improve the lives of millions of Americans. To give a sense of the company’s size and scale: last year, CVS Health dispensed or managed about 1.9 billion prescriptions. About half of those were filled at our pharmacy locations, and the rest were managed by CVS Caremark, the company’s leading pharmacy benefits manager (PBM) with nearly 90 million plan members.

Operating at that scale gives us a tremendous opportunity to make a difference in improving the health of our patients, communities, and colleagues. But it can also challenge us to compete for the talent we need to sustain our operations and thrive in today’s dynamic economy. Our strategic priority to develop a strong pipeline of future talent led us to identify an opportunity to enhance our hiring practices at our customer care call centers. As we share our methodology and outcomes, we want to highlight a key lesson from the practitioner side: “bridging” science and practice was primarily about engaging our business partners early and throughout the project and demonstrating the value and return on investment they would reap.

Development and Implementation of the Customer Care Representative Prehire Screening Assessment

Because of the cyclical demands of call center staffing in our industry, we hire Customer Care Representatives (CCRs) at a high volume during certain seasons of the year. CCRs are often the front line to our patients, payers, and providers when they need to communicate with our PBM. Our particular challenges were:

a. We interviewed most applicants, creating a slow and burdensome process for applicants and hiring managers.

b. We based hiring decisions on resumes and interview performance, but this did not guarantee that new hires could perform the required technical skills (e.g., the ability to toggle and type) while providing excellent customer service.

c. Our applicants are also our customers, so we constantly strive to maintain a positive perception of the CVS Health brand in the marketplace.

Thus, the goals of our intervention were to improve and streamline the existing selection process while providing an engaging candidate experience and increasing new hire performance.

Applying the Science of Psychology to our Screening Solution

We adhere to best practices as prescribed by the Uniform Guidelines on Employee Selection Procedures (EEOC, 1978) and the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 1999), and develop job-related assessment content. This involves:

  • Conducting an extensive job analysis to ensure content validity
  • Using a battery of measures, such as situational judgment tests, high-fidelity work simulations, and biodata, to measure a range of technical competencies and work fit
  • Including a realistic job preview to engage applicants and provide additional context so they can evaluate job fit (Kang & Yu, 2014)
  • Conducting a concurrent validity study to refine content and develop scoring algorithms that maximize validity and minimize subgroup differences
  • Pilot testing assessments to ensure they function as intended
  • Continuously monitoring results
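
As a small illustration of the kind of subgroup monitoring the Uniform Guidelines call for, the sketch below applies the four-fifths (80%) rule to selection rates. All applicant counts and group labels are invented for illustration; they are not CVS Health data.

```python
# Hypothetical sketch of the EEOC four-fifths (80%) rule, one common
# first screen for adverse impact when monitoring assessment results.
# All counts below are invented for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Proportion of applicants who passed the screen."""
    return selected / applicants

def impact_ratio(focal_rate: float, reference_rate: float) -> float:
    """Focal group's selection rate relative to the highest group's rate."""
    return focal_rate / reference_rate

# Invented screening outcomes for two applicant groups
groups = {
    "group_a": {"selected": 180, "applicants": 300},  # rate = 0.60
    "group_b": {"selected": 110, "applicants": 250},  # rate = 0.44
}

rates = {name: selection_rate(g["selected"], g["applicants"])
         for name, g in groups.items()}
highest = max(rates.values())

for name, rate in rates.items():
    ratio = impact_ratio(rate, highest)
    flag = "OK" if ratio >= 0.80 else "review for adverse impact"
    print(f"{name}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")
```

In practice a ratio below 0.80 is only a flag for further review; the concurrent validity study and scoring-algorithm work described above are where subgroup differences are actually minimized.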

After the CCR assessment had been in place for a year, we examined its impact on administrative efficiency and quality of hire. Several robust metrics were available both before and after implementation, allowing us to apply a pre–post quasi-experimental design comparing new hires selected without the assessment to new hires selected with it. We also calculated the hours and dollars saved by automating the first stage of screening, given that, prior to implementation, we had used a high-touch process to screen all applicants. For more detailed information about our methodology and the significant gains in effectiveness and efficiency, see our HRM Impact Award profile.
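The efficiency side of that evaluation can be sketched in a few lines. Every figure below (applicant volume, minutes per manual screen, screen-out rate, recruiter cost) is a hypothetical placeholder, not a CVS Health number.

```python
# Hypothetical sketch of the efficiency calculation described above:
# hours and dollars saved when the assessment screens out a share of
# applicants before any manual review. All inputs are illustrative.

def screening_savings(applicants: int,
                      minutes_per_screen: float,
                      auto_screen_out_rate: float,
                      recruiter_hourly_cost: float) -> dict:
    """Estimate recruiter time and cost no longer spent on manual
    first-stage screening of applicants the assessment screens out."""
    screened_out = applicants * auto_screen_out_rate
    hours_saved = screened_out * minutes_per_screen / 60
    return {
        "applicants_auto_screened": round(screened_out),
        "hours_saved": round(hours_saved, 1),
        "dollars_saved": round(hours_saved * recruiter_hourly_cost, 2),
    }

# Illustrative inputs: 10,000 seasonal applicants, 20 minutes of manual
# screening each, 40% screened out by the assessment, $35/hour.
print(screening_savings(10_000, 20, 0.40, 35.0))
```

Even with modest inputs like these, the annualized hours quickly become the kind of concrete figure that resonates with operational leaders.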

Bridging Science and Practice

Before delving into how we built the bridge, we would like to highlight the advantages of conducting research in the business world. First, our work directly impacts business outcomes. Research conducted with real incumbents and job applicants may yield critical information that confirms or enriches findings from controlled settings. In addition, as we have found at CVS Health, the clearly positive results of successfully implemented assessments set a precedent and open the door to similar projects in the future.

Still, even a rigorously developed and vetted screening assessment requires significant commitment from leaders who are ultimately focused on running their business. We identified three primary methods of impressing on leaders the importance and value of our work: evidence-based science, project management, and stakeholder alignment. We have shared our evidence-based approach above; for project management practices, we refer you to information shared by previous HRM Impact Award winners, such as Jack-in-the-Box (Schiemann & Seibert, 2017) and Bank of America (Niedle & Littrell, 2016). We outline best practices for stakeholder alignment below, in the hope that they provide takeaways for early-career I-O practitioners and students on how to maintain stakeholder enthusiasm.

1. Identify your stakeholders. It is critical to know who your stakeholders are and to understand their interests and preferences related to the resources you are about to request, as well as the impact of your project on their responsibilities. The most obvious stakeholders are operational leaders. Below are examples of other stakeholder groups who have been integral to the success of the CCR project:

a. Talent acquisition (TA): Prehire screening assessments are resources for, and part of, the acquisition process. We share with TA the goals of engaging applicants early in the process and building a high-quality workforce. At the same time, there is an inherent tension between the goals of screening applicants out and screening them in as quickly as possible.

b. Human resources business partners (HRBPs): HRBPs are the practitioner’s best friends. They help us reach the operational leaders on whom we rely for resources (e.g., job analysis participants, content reviewers, raters). Furthermore, they are often the first colleagues that hiring managers turn to when they have questions (or doubts) about the tool or the process.

c. Vendors: We want work done well and economically. Vendors need to adhere to scientific and regulatory requirements as well as best practices. Although there is no single right way to achieve this objective, it is important to identify the right blend of efficiency and integrity. It is also important to maintain transparency across multiple vendors, especially regarding progress and risks that might impact their work.

d. Applicants: Job applicants are our customers, and even if they do not receive—or choose not to accept—an opportunity, we want to leave them with a positive impression of our brand.

2. Never stop working for stakeholder buy-in. Obtaining buy-in is not just an up-front activity. It requires continuous work from the start of a project through well past implementation. As new colleagues enter hiring leader roles, as tenured colleagues’ familiarity with the original implementation ebbs over time, and as applicant volume fluctuates (creating pressure to fill roles), practitioners have to reiterate why we use an assessment and how it brings value. The best practices that follow are approaches we found valuable for maintaining buy-in for the CCR assessment.

3. Manage expectations. The development of a customized, valid assessment is not a speedy solution. Transparency and communication are critical, and the first and most important information for stakeholders is:

  • A realistic timeline;
  • A relatively accurate description of the resources required; and
  • Assurance that significant resources up front lead to significant gains in the long term.

Prior to launching, we shared our learnings from similar projects in other areas of our business and from stories shared by similar organizations. It was also helpful not just to tell stakeholders how we leverage technology but to show them examples of engaging, multimedia assessments.

4. Collect, incorporate, and showcase input from the business in every facet of development, validation, and implementation. Conversations with and feedback from the business were critical to the success of this assessment. For example:

  • We convened a steering committee with leaders from the business, HR, and TA. We documented discussions and key decisions made by this committee.
  • Subject matter experts (SMEs) from the business informed the work by describing work requirements, what differentiates successful from unsuccessful new hires, and the rewards and challenges of the job. This was resource intensive but essential to creating content that engaged applicants and resonated with our stakeholders.
  • Managers reviewed the draft content and provided feedback before it was finalized.

Post implementation, this has helped us reinforce how the assessment is uniquely suited to CVS Health. Two years later, new managers continue to ask, “What is the assessment about? How do I know it works?” We are able to tie content back to the contributions of the SMEs most salient to the person asking the question (e.g., trainers in their location). We can also share exactly how a leader’s feedback informed a change in direction and provide the supporting rationale.

5. Demonstrate the value in stakeholder language. Sometimes, the biggest learning curve after leaving school is handling the lack of enthusiasm from business leaders for phrases like “criterion-related validity” and “statistical reliability.” To be fair, this is not always the case: leaders are metrics oriented and appreciate significant results. More typically, though, our stakeholders are interested in the following types of information:

  • What does it measure? Does it measure everything I need?
  • Is it really improving the quality of our new hires? What is the percent increase in quality [or productivity]?
  • How is it reducing turnover?
  • Is it too hard [too easy]?

We can describe how the content was created and refined based on SME input; for example, referring back to the scenarios gathered during the job analysis to show how the tool captures not only job requirements but the context of work.

All of the validity and outcome data are available, and we meet periodically with leaders to address their questions. But it is critical to describe the data in a way that resonates with leaders: how the assessment differentiates job applicants and leads to improved operational outcomes.

Of course, as our performance metrics partners at Mattersight noted, data are only as good as they are meaningful to the people with whom we share them. It has been important to understand the audience’s comfort level with data visualization and to tailor the message accordingly. This applies when sharing data with other analysts as well; as analysts working closely with data, it is easy to forget that what is intuitive to us may not be intuitive to third parties.

As our assessment partners at Shaker noted, the administrative efficiency gained by simply adding an assessment to a hiring funnel gets little attention in the literature relative to demonstrations of posthire impact on performance and retention. The largest immediate and measurable result realized in the first year of administration was the reduction in time spent sorting through less qualified candidates. Although the literature largely ignores the practical utility and ROI that this early-stage screening provides, it receives heavy attention from the highest levels of the organization.


Our biggest learning from the CCR assessment project is that building the bridge involves engaging stakeholders at the beginning and continuously working to ensure their buy-in for the long term. We did this by identifying who our stakeholders are, understanding their expectations and how they operationalize ROI, identifying business metrics and outcomes that are important to them, and translating empirically based results into metrics that resonate with them. At the end of the day, operational leaders may not be swayed by our adherence to the Uniform Guidelines, but they may be more influenced when we can say, “We literally saved 3 years’ worth of recruiter time in the first six months of using this tool.”

We won minds by showing data that the assessment saved time, improved training-to-proficiency outcomes, benefited overall performance on measurable outcomes, and provided an informative and engaging candidate experience. We won hearts by showcasing the experience once the assessment was developed, so our stakeholders could see for themselves its custom nature and the way in which it is deeply contextualized to our business. We continue to learn more and better ways to carry out our work, but so far, we have found that the remaining gaps in the bridge can be closed with strong project management and clear, two-way communication with stakeholders.

Calling Potential Contributors to “The Bridge: Connecting Science and Practice” 

As outlined in Poteet, Zugec, and Wallace (2016), the TIP Editorial Board and Professional Practice Committee continue to have oversight and review responsibility for this column. We are happy to announce that, starting April 2018, Kimberly Adams and Stephanie Zajac will be assuming editorial duties for this column. We invite interested potential contributors to contact them directly with ideas for columns.


We offer special thanks to Gabriel Lopez-Rivas of CVS Health, Daly Vaughn and Carter Gibson of Shaker International, and Scott Inns of Mattersight for their work on the assessment project and their contributions to the best practices and lessons learned in this article.


References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: APA.

Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor, & Department of Justice. (1978). Uniform guidelines on employee selection procedures. Federal Register, 43(166), 38290–38315.

Kang, Y., & Yu, T. (2014). Person–organization fit effects on organizational attraction: A test of an expectations-based model. Organizational Behavior and Human Decision Processes, 124, 75–94.

Niedle, L. B., & Littrell, L. N. (2016, July). The bridge: Connecting science and practice. The Industrial-Organizational Psychologist, 54(1).

Schiemann, W. A., & Seibert, J. H. (2017, June). Winning the HRM Evidence-Based Impact Award-Lessons learned: A conversation with key stakeholders to the process. Industrial and Organizational Psychology, 10(2), 314–326.

