
The Bridge: Connecting Science and Practice

Kimberly Adams, Independent Consultant, & Stephanie Zajac, UT MD Anderson Cancer Center

“The Bridge: Connecting Science and Practice” is a TIP column that seeks to facilitate learning and knowledge transfer in order to encourage sound, evidence-based practice. It provides academics with an opportunity to discuss the potential and/or realized practical implications of their research as well as to learn about cutting-edge practice issues or questions that could inform new research programs or studies. For practitioners, it provides opportunities to learn about the latest research findings that could prompt new techniques, solutions, or services that would benefit the external client community. It also provides practitioners with an opportunity to highlight key practice issues, challenges, trends, and so forth that may benefit from additional research. In this issue, Dr. Aimee Gardner describes the integration of industrial-organizational psychology principles into the surgical residency selection process to address issues with efficiency, effectiveness, and fairness. Dr. Gardner also describes challenges that arose in bringing I-O to a new field and the strategies taken to overcome them.

 

Applying Selection Science to the Surgical Community

Aimee K. Gardner, PhD

Aimee Gardner, PhD, specializes in applying traditional industrial-organizational (I-O) psychology principles related to training and development, assessment, and selection to the healthcare field. Her work has been published in over 100 peer-reviewed articles and book chapters. Since completing graduate school (University of Akron, 2013), she has held roles in medical simulation centers and clinical departments, and she currently serves as an assistant dean at Baylor College of Medicine in Houston, Texas.

While serving in these roles, Gardner quickly observed a significant gap between the research base and professional guidelines developed within I-O psychology and the selection practices used in healthcare. Although applying the science of selection is important to optimizing the efficiency and effectiveness of any organization, this integration is even more critical in high-stakes industries like healthcare. These gaps are concerning not only because healthcare-hiring organizations may be spending valuable time and resources on ineffective selection systems but also because healthcare is a field in which placing the wrong person in a job can have a widespread impact on patient care, clinical outcomes, and society at large.

To bridge these gaps and spread knowledge of I-O science to the medical education community, Gardner joined forces with a practicing surgeon in 2016 and founded SurgWise Consulting, a boutique, interdisciplinary consulting firm that specializes in integrating I-O research and principles into the selection and assessment of surgeons. SurgWise helps hospitals implement more evidence-based selection techniques for identifying those best fit for a career in surgery through multimethod competency modeling, deployment of customized screening assessments, and incorporation of structured interviews.

What’s the Problem?

Each year, over 65,000 medical students apply for a residency position in the United States (AAMC, 2018). Residency training is a necessary step for newly minted MDs to receive focused training in the field of their choosing (pediatrics, dermatology, surgery, etc.) and to obtain board certification to practice independently within their specialty. A substantial number of residents then go on to another level of specialty training through fellowships. For example, 80% of general surgery residents continue on to fellowship programs, gaining more focused experience in areas such as pediatric surgery, cardiovascular surgery, and critical care (Lewis & Klingensmith, 2012). Unfortunately, these transitions from medical school to residency and from residency to fellowship can be a substantial headache for both the applicants and the hiring organizations. Below, we describe some of the outcomes of, and opportunities within, the current selection processes with regard to efficiency, effectiveness, and fairness.

Efficiency

For the majority of medical specialties, there are substantially more applicants than positions available, leading applicants to apply to a wide array of programs across the country to increase their chances of securing a position within their specialty of choice. For many competitive specialties, such as surgery, this means that applicants apply to over 50 different programs across the United States (AAMC, 2018). Unfortunately, that high volume creates a substantial burden on the hiring programs themselves. For example, surgery residency programs receive an average of around 800 applications to fill just five positions each year (AAMC, 2018). Although the application packets consist of traditional materials, such as medical-school grade point average, personal statements, letters of recommendation, and performance on national licensing examinations, the majority of decision makers agree that these data are not helpful in identifying the best-fit candidates for a rigorous, lengthy, and demanding career in surgery. However, without other usable information to filter the large volume of applications down to a more manageable size, program directors typically apply a numeric cutoff of their choice to the one metric common to all applicants (licensing-examination scores), manually review the remaining files for favorable letters and convincing personal statements, and invite a number of the remaining candidates (often 100+) for on-site interviews (NRMP, 2018). Surgeons within each hospital then close clinics, cancel their operating room schedules, and pause their other responsibilities to spend time with each candidate during hospital tours and one-on-one interviews, 95% of which are unstructured (Kim et al., 2016).
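To make the scale of this screening funnel concrete, the following is a minimal illustrative sketch (in Python) of the implied selection ratios, using only the approximate per-program figures cited above; these are rough averages rather than figures for any particular institution.

    # Illustrative only: approximate per-program figures cited above (AAMC, 2018; NRMP, 2018)
    applications = 800   # average applications received by a surgery residency program each year
    interviews = 100     # candidates commonly invited for on-site interviews
    positions = 5        # residency positions to be filled

    print(f"Interview rate: {interviews / applications:.1%}")           # ~12.5% of applicants interviewed
    print(f"Positions per interviewee: {positions / interviews:.1%}")   # ~5.0% of interviewees selected
    print(f"Overall selection ratio: {positions / applications:.2%}")   # well under 1% of applicants selected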

As you might imagine, this process can be substantially burdensome for both the candidates and the clinicians themselves, along with the patients who are on the receiving end of delayed appointments and potential cancellations. It is estimated that medical students applying to surgery spend up to $12,000 on the application process and travel to attend up to 20 on-site interviews, missing weeks of their final year of medical training (AAMC, 2015). In other specialties, such as ophthalmology, that number reaches up to $20,000 per applicant! The hiring hospitals also take on a significant financial burden during this selection process. We recently partnered with a national surgical society to estimate the time and resources each program spends on the screening-and-selection process each year. We found that the average surgery residency program, seeking to fill just five positions, conservatively spends an average of $100,000 each year trying to identify the best-fit candidates (Gardner et al., 2019). Extrapolated to the ~280 programs going through this process each year, the specialty as a whole spends roughly $28 million each year trying to identify which applicants will make the best surgeons.
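As a quick back-of-the-envelope check on that figure, the following is a minimal sketch (in Python) of the extrapolation, using only the per-program cost and approximate program count cited above; both inputs are rough estimates rather than audited figures.

    # Illustrative arithmetic only, using the approximate figures cited above
    # (Gardner et al., 2019; AAMC, 2018)
    cost_per_program = 100_000   # average annual screening-and-selection cost per program (USD)
    num_programs = 280           # approximate number of surgery residency programs each year

    specialty_total = cost_per_program * num_programs
    print(f"Estimated specialty-wide annual cost: ${specialty_total:,}")   # $28,000,000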

Effectiveness

As will come as no surprise to anyone with a background in I-O, the ability of many of these screening tools and processes to predict later performance as a surgeon in training or practice is quite weak. Researchers have tried to tie application-packet materials and interview ratings to a wide array of performance outcomes relevant to surgeons—such as performance in the clinic, procedural performance, awards received, patient satisfaction, need for remediation, and performance on later board certification examinations—but to no avail (Fryer et al., 2012; Mainthia et al., 2014; Stohl et al., 2010; Sutton et al., 2014). Perhaps even more concerning is the inability to identify who will even complete training and have a long-lasting career as a surgeon. Surgery is a specialty with especially high attrition: Around one-third of all trainees leave their programs, many within the first year or two (Yeo et al., 2017). Of those who do complete training, around 30% require at least one formal intervention to remediate nontechnical skills such as professionalism and interpersonal skills (Yaghoubian et al., 2012). In sum, there is ample opportunity to identify and implement tools and processes that will help these organizations identify those who will meet the demands of, and thrive in, a career in surgery.

Fairness

Surgery is a profession that has traditionally been dominated by White men. Only recently have national campaigns been developed to draw attention to the racial and gender disparities within the profession as a first step toward ameliorating them. Unfortunately, these efforts are unlikely to fully take hold until the current pipeline is rid of structural biases and inequitable screening practices. For example, the United States Medical Licensing Examination (USMLE) has been identified as a substantial barrier to entry into medical practice for underrepresented groups (Edmond et al., 2001; Rubright et al., 2019). Despite the paucity of data linking examination performance to later performance in training or practice (McGaghie et al., 2011), and the test developers’ recommendation that it not be used for selection (Katsufrakis & Chaudhry, 2019; Prober et al., 2016), the score on this examination is cited as the most important factor in determining which applications will be reviewed further by hiring organizations. Other influential application materials, such as letters of recommendation, have been criticized for gender bias and for favoring applicants with “insider status” and connections to influential surgeons in the field (Higgins, 2017; Turrentine et al., 2019). Finally, reliance on unstructured interviews has created a screening process that applicants must strategically navigate, with recent data indicating that over 80% of all applicants to surgery receive at least one inappropriate or illegal interview question (Hern et al., 2016).

The Birth of SurgWise    

Surgeons are spending hours away from their patients and families to meet one on one with hundreds of candidates, relying on screening tools with little validity for predicting future performance, and trusting unstructured processes susceptible to bias, all while trying to increase the diversity of the workforce. This state of affairs is likely striking to both researchers and practitioners within I-O. After discussing these issues with physician friends at other institutions, I quickly learned that I was not the only one exasperated with the current process. Although it was rare to find someone who was familiar with the field of I-O and how it could contribute, my colleagues across the country were eager to find a better way to identify those best fit for a physician role in their hospitals, and they were up for experimentation. Through these conversations, I learned about a wide array of new “techniques” being implemented to help programs identify the best-fit candidates for surgery careers, including requiring applicants to play the game of Operation® to demonstrate their stress management skills, arm wrestle an interviewing surgeon to test their hand strength, and assemble Potato Head “families” to assess attention to detail. Thus, it was clear that there was at least a small proportion of programs across the country willing to “experiment” by integrating the science of I-O into their selection systems.

However, I knew that to be successful in any effort to revamp the selection process, I would have to be strategic. The surgeon community is not always keen on “outsiders” entering their space and telling them how things should be done; they are much more likely to listen to a colleague with whom they have shared experiences. Fortunately, around this time I was working on a number of research projects with a nationally known and well-respected surgeon who had also expressed significant frustration with selecting the right candidates for his own program. After discussing some of these pain points with him and introducing him to a field called I-O psychology that could provide guidance on these issues, we decided to partner and pilot a more evidence-based selection process at his institution. We were delighted to observe a number of benefits (increased efficiency, less bias, selection of more diverse candidates, etc.) after just the first year of implementation—so much so that we agreed that a number of other organizations across the country could benefit as well and that we should formalize our partnership and processes through the development of SurgWise.

Lessons Learned and Additional Opportunities

Integrating I-O into a new field has brought with it a whole host of lessons learned and opportunities for further research and practice. Below, we provide just a sample of the many challenges we have experienced over the past few years, along with our attempts to overcome them through strategic partnerships, communication strategies, education, and research.

Challenge: Achieving buy-in as an “outsider”

How we have attempted to overcome it:

  • Enlisted a surgeon as cofounder to help represent the company and “sell” I-O techniques
  • Included personal “pain” stories in communications with potential clients
  • Had our clients present their success stories independent of us
  • Strategically partnered with influential organizations and societies within the profession

Challenge: Data from outside the industry are not convincing

How we have attempted to overcome it:

  • Replicated well-established I-O studies within the medical field
  • Presented and published at national meetings and in professional journals in the medical field
  • Brought in data from other respected high-stakes industries (e.g., military, aviation)

Challenge: Even with a high applicant-to-hire ratio, programs fear that a new process will “scare away the good ones”

How we have attempted to overcome it:

  • Collected test-abandonment and applicant drop-out rates
  • Provided reports on the characteristics of those who chose not to participate in the selection system
  • Incorporated applicant-participation metrics into consulting contracts

Challenge: Little fear of litigation among hiring programs

How we have attempted to overcome it:

  • Provided education on the various points throughout the selection system that could be scrutinized
  • Emphasized the societal and ethical issues with implementing inequitable selection processes
  • Tied legal responsibility within selection to other well-understood legal concepts (e.g., medical litigation)

Challenge: Significant reliance on intuition and experience in making selection decisions

How we have attempted to overcome it:

  • Reviewed and organized historical selection documentation to identify prior remediation and attrition data
  • Performed case studies with clients to identify the impact of the traditional versus the new selection system
  • Incorporated standardized training programs to highlight the impact of relying solely on intuition or instinct
  • Connected clients with independent I-O researchers to explore historical patterns such as stereotypicality and bias in letters of recommendation

 

These obstacles notwithstanding, we have had many wins and success stories throughout this process. Thus far, around 70–80% of all applicants to surgery training programs complete at least one of our assessments. Because of this reach, and because of our mission to create more evidence-based and equitable pathways into the profession, we have been able to move the needle in helping the profession reach its workforce diversity goals (Gardner et al., 2019). By de-emphasizing screening tools and practices that we know can systematically disadvantage individuals from underrepresented groups, and instead relying on tools with a strong evidence base that meet professional guidelines, we have been able to enhance the diversity of the field in just a few short years (Gardner et al., 2019). We have also helped organizations cut down on the number of interviews they conduct, giving their surgeons more time to care for patients, pursue their academic interests, and spend time on activities that help them achieve their work–life integration goals.

Fortunately, there is a lot more to be learned within this industry, and it is ripe for research. For example, the centralized application and selection process for all candidates pursuing this career pathway means that we can study the same population of applicants across multiple organizations, assessing an array of application behaviors such as response consistency, faking, and dropout tendencies. Furthermore, the multidimensional nature of surgeon performance means that there is a wide array of outcomes to explore in measuring the effectiveness of a selection system, such as procedural performance, licensing-examination performance, patient care scores, and numerous nontechnical competencies. Finally, because all programs within a given specialty are required to provide similar performance data on those selected, we can review uniform performance metrics across numerous organizations and follow the performance of the “lost tribe”—those who were rejected by one organization but selected into another across the country.

Conclusion

In sum, it has been very rewarding (and a lot of fun!) to help a high-stakes industry solve some of its talent management issues through the integration of I-O science and principles. The complexities of the healthcare training-and-hiring system present exciting opportunities for I-O researchers and practitioners alike.

References

Association of American Medical Colleges. (2015). Cost of Applying to Residency Questionnaire report. https://www.aamc.org/download/430902/data/costofapplyingtoresidency.pdf

Association of American Medical Colleges. (2018). Surgery-general (categorical). https://www.aamc.org/system/files/reports/1/362130-generalsurgerycategorical.pdf

Edmond, M., Deschenes, J. L., Eckler, M., & Wenzel, R. P. (2001). Racial bias in using USMLE step 1 scores to grant internal medicine residency interviews. Academic Medicine, 76, 1253–1256. DOI: 10.1097/00001888-200112000-00021

Fryer, J. P., Corcoran, N., George, B., Wang, E., & DaRosa, D. (2012). Does resident ranking during recruitment accurately predict subsequent performance as a surgical resident? Journal of Surgical Education, 69, 724–730.

Gardner, A. K., Cavanaugh, K. J., Willis, R. E., & Dunkin, B. J. (2019). Can better selection tools help us achieve our diversity goals in postgraduate medical education? Comparing use of USMLE step 1 scores and situational judgment tests at 7 surgical residencies. Academic Medicine. DOI: 10.1097/ACM.0000000000003092.

Hern, H. G., Trivedi, T., Alter, H. J., & Wills, C. P. (2016). How prevalent are potentially illegal questions during residency interviews? A follow-up study of applicants to all specialties in the national resident matching program. Academic Medicine, 91, 1546–1553.

Higgins, R. S. D. (2017). Inside baseball: Leveling the playing field in the surgical residency selection process. Annals of Surgery, 267, 1.

Katsufrakis, P. J., & Chaudhry, H. J. (2019). Improving residency selection requires close study and better understanding of stakeholder needs. Academic Medicine, 94, 305–308. DOI: 10.1097/ACM.0000000000002559

Kim, R. H., Gilbert, T., Suh, S., Miller, J. K., & Eggerstedt, J. M. (2016). General surgery residency interviews: Are we following best practices? American Journal of Surgery, 211, 476–481.

Lewis, F. R., & Klingensmith, M. E. (2012). Issues in general surgery residency training—2012. Annals of Surgery, 256, 553–559.

Mainthia, R., Tarpley, M. J., Davidson, M., & Tarpley, J. L. (2014). Achievement in surgical residency: Are objective measures of performance associated with awards received in final years of training? Journal of Surgical Education, 71, 176–181.

McGaghie, W. C., Cohen, E. R., & Wayne, D. B. (2011). Are United States Medical Licensing Exam step 1 and 2 scores valid measures for postgraduate medical residency selection decisions? Academic Medicine, 86, 48–52. DOI: 10.1097/ACM.0b013e3181ffacdb

National Resident Matching Program (NRMP). (2018). Results of the 2018 NRMP Program Director Survey. https://www.nrmp.org/wp-content/uploads/2018/07/NRMP-2018-ProgramDirector-Survey-for-WWW.pdf

Prober, C. G., Kolars, J. C., First, L. R., & Melnick, D. E. (2016). A plea to reassess the role of United States Medical Licensing Examination step 1 scores in residency selection. Academic Medicine, 91, 12–15. DOI: 10.1097/ACM.0000000000000855

Rubright, J. D., Jodoin, M., & Barone, M. A. (2019). Examining demographics, prior academic performance, and United States Medical Licensing Examination scores. Academic Medicine, 94, 364–370. DOI: 10.1097/ACM.0000000000002366

Stohl, H. E., Hueppchen, N. A., & Bienstock, J. L. (2010). Can medical school performance predict residency performance? Resident selection and predictors of successful performance in obstetrics and gynecology. Journal of Graduate Medical Education, 2, 322–326.

Sutton, E., Richardson, J. D., Ziegler, C., Bond, J., Burke-Poole, M., & McMasters, K. M. (2014). Is USMLE step 1 score a valid predictor of success in surgical residency? American Journal of Surgery, 208, 1029–1034.

Turrentine, F. E., Dreisbach, C. N., St. Ivany, A. R., Hanks, J. B., & Schroen, A. T. (2019). Influence of gender on surgical residency applicants’ recommendation letters. Journal of the American College of Surgeons, 228, 356–365.

Yaghoubian, A., Galante, J., Kaji, A., Reeves, M., Melcher, M., Salim, A., Dolich, M., & de Virgilio, C. (2012). General surgery resident remediation and attrition. Archives of Surgery, 147, 829–833.

Yeo, H., Abelson, J., Mao, J., Lewis, F., Michelassi, F., Bell, R., Sedrakyan, A., & Sosa, J. A. (2017). Who makes it to the end? A novel predictive model for identifying surgical residents at risk for attrition. Annals of Surgery, 266, 499–507.

