
October 24


8:30 - Noon

Optional workshop (see below for details)

1:30 - 5:00

Optional workshop (see below for details)

7:30 - 9:00pm

Welcome Reception (sponsored by BTS) on The Terrace

Day 1
October 25



7:30 - 8:30



Welcome and Introduction


Darko Lovric (Incandescent): Taking a Venture Capitalist Perspective on People Analytics

New technologies are radically changing what we can know about how people engage with their work and each other. Assessments in many forms generate data from which a broad range of analytics are emerging, offering promise for new insights; these insights, in turn, may help organizations gain competitive advantage. Many of these technologies are supported by a venture ecosystem. How are VCs and accelerators evaluating the potential of these technologies and making their decisions? How does venture capital (VC) shape the design, validity, and usability of these assessments? How does VC inform psychologists’ selection and use of these assessments? What can this tell us about the future focus of people analytics? In the fight of usability vs. effectiveness, or science vs. belief, who wins?


Advances & Trends (Part 1)

  • Nathan Mondragon (HireVue) - Whole Candidate Evaluation With AI: Prediction of Job Success without the Bias

In this presentation, analysis of Video Interview answers and simulation behaviors will be used to describe the way AI methods (e.g., Natural Language Processing, Semantic Analysis, and Emotion Categories) are used to generate competency profiles for job applicants. We will present construct validation evidence of these psychological measures along with criterion validation evidence from a variety of studies. Finally, a discussion of AI and bias will be provided and how algorithmic bias can be mitigated if planned for and monitored.

  • Robert Gibby (IBM) - AI-Based Candidate and Recruiter Job Matching

Natural language processing, machine learning and other Artificial Intelligence capabilities are allowing us to understand and integrate data (such as candidate characteristics and skills) in new ways to inform candidates, recruiters and hiring managers. I will highlight how these capabilities are being successfully designed and leveraged to match candidates to jobs.

  • Daly Vaughn (Modern Hire) - Social Media and Selection

As Social Media (SM) applications have become integrated into the daily life of billions, organizations are exploring how technology enables examination of candidate SM data. To optimize appropriate use of social media within the boundaries of federal regulations, it is imperative that HR practitioners and researchers guide appropriate SM usage in selection. This session will discuss trends and updates from research and practice.


Break (sponsored by EchoSpan)


Advances & Trends (Part 2)

  • MQ (Mengqiao) Liu (Amazon) - Making Sense and Predictions from Unstructured Data in Assessment

Eighty percent of all enterprise data are unstructured, with less than 1% analyzed. This session will describe the state-of-the-art, use cases, and challenges of natural language processing (NLP) techniques and discuss how organizations are using NLP as part of assessment processes to uncover insights that were previously hidden using traditional methods.

  • Ken Lahti (SHL) - Innovations in Mobile Assessment of Talent

Organizations and users expect all HR technologies to be mobile accessible – including assessment. However, mobile conversions of traditional assessments have resulted in non-equivalent measures and/or longer administration times. This presentation will cover recent noteworthy advances in the evolution of psychometric technology achieved by focusing on user experience and mobile-first design.

  • Seymour Adler (Kincentric) - Technology and Simulation Design: Opportunities and Cautions

While the core science underlying effective use of simulations in assessment remains unchanged, technological changes have radically altered simulation design, delivery, and the participant experience. We will address these changes, project trends, and explore the implication of the evolving future of work for how we design and apply simulations.


Reflections on Advances & Trends

  • Richard Landers (University of Minnesota)

Dr. Landers will synthesize the technological and methodological advances for assessing talent and evaluate the state of the research and adoption of advanced assessment methods in assessment programs. He will also review key considerations when determining whether and how to apply new assessment methodologies to assessment programs.


High Impact Practitioner Talks

  • Suzanne Tsacoumis (HumRRO) - High-Stakes, Rich-Media Simulations: Practical Applications

Organizations that embrace and incorporate rich media into their operational testing practices are transforming the assessment landscape. Suzanne will provide a tour of the most complex, and potentially controversial, applications of rich media as she describes selection programs that use highly immersive simulations that allow the test taker to “interact” with the assessment.

  • Ben Hawkes (Shell) - Artificial Intelligence: What Lies Between the Hype and the Hate?

Aimed at assessment professionals who are more used to evaluating ‘traditional’ psychometric techniques, this session will present a practitioner’s toolkit: a concise set of methods to evaluate the utility and defensibility of AI-based selection approaches – from the stage of ‘kicking the tires’ through to implementation and post-launch monitoring.




The Commercialization of Assessments: Managing the Risks 

  • Adam Klein, Esq., Outten & Golden LLP
  • Ken Willner, Esq., Paul Hastings, LLP

This discussion will draw upon the perspectives of both plaintiffs' and defense counsel to explore risks associated with the rapid evolution of assessments for talent acquisition. The diversity of viewpoints will provide a rich appreciation for how to evaluate and manage the challenges associated with implementing innovative assessment technologies.


Regulations/Principles & Standards Update

  • Nancy Tippins (The Nancy T. Tippins Group) - Update on SIOP Principles Revision

This session reviews the major changes in the fifth edition of the Principles for the Validation and Use of Personnel Selection Procedures that were stimulated by the recent update to the Standards for Educational and Psychological Testing.

  • Jenny Yang (Urban Institute) – A Perspective on Equity and the Algorithmic-Driven Hiring Process

As more employers turn to AI-driven algorithmic screens to assess, score, and rank applicants, algorithms have become gatekeepers to economic opportunity. Critical questions remain about how we adapt our laws to apply to rapidly advancing technology and ensure that our employment processes are designed in a responsible way that promotes equal opportunity.

  • Kate Andresen, Esq. (Nilan Johnson Lewis) - Does GDPR compliance mean we're covered for any data privacy laws?

Under the General Data Protection Regulation (GDPR), organizations need to comply by: (i) honoring individual rights; (ii) protecting the privacy of personal information; and (iii) employing appropriate security measures. Will doing so mean organizations are compliant with other privacy laws? This presentation will focus on data privacy and security in the context of workplace assessments.


Reflections on Legal & Standards Update

  • Kathleen Lundquist (APTMetrics)

Dr. Lundquist will review how selection and assessment programs are being influenced by critically important regulatory and legal changes and summarize the impact of data privacy laws on the future of assessment programs.  She will also explore the risks associated with the rapid evolution of assessments for talent acquisition.


Break (sponsored by EchoSpan)


Speed Benchmarking (Emily Solberg, Jerilyn Hayward, Charles Handler, Daly Vaughn)


High Impact Practitioner Talk

  • Charles Scherbaum (Baruch) - Using Psychological Assessments to Predict Player Performance in the NFL

As documented in the popular book Moneyball (Lewis, 2003), professional sports teams have shifted dramatically toward using sports analytics to predict the success of athletes. While teams use a wide range of data to do this, they have not fully exploited standardized psychological assessment in their analytical models. In fact, many have questioned whether psychological assessments are useful for predicting success in the unique context of professional sports. This presentation describes the Player Assessment Test (PAT), administered by the NFL to all players attending the NFL Combine from 2013 to 2019. Results of a multi-year validation study demonstrate that the PAT significantly correlates with a number of player performance outcomes. This presentation reviews these results and discusses how psychological assessments like the PAT can provide critical data for predicting performance outcomes in college and professional sports, as well as in more traditional organizational settings.


Thomas Dimitroff (General Manager, Atlanta Falcons), Dean Stamoulis (Russell Reynolds Associates), Harold Goldstein (The Graduate Center & Baruch College, CUNY Psychology Department) - An NFL General Manager’s View on Assessment and Talent

NFL teams have started to place more emphasis on non-physical characteristics when selecting athletes. Thomas Dimitroff, one of the longest tenured General Managers in the NFL, will be interviewed by Drs. Stamoulis & Goldstein to address how the Atlanta Falcons front office approaches player assessment. In addition, implications for the practice of industrial/ organizational psychology will be discussed.


LEC Committee Chairs - Preview of Day 2

Social Hour (sponsored by APTMetrics) in Ellington Prefunction


Networking Dinners (Meet in Ellington Prefunction by 5:45 PM)

Day 2 
October 26



7:00 - 8:00



Welcome, Opening


Robert Hogan (Hogan Assessment Systems) - Personality Measurement: Yesterday, Today & Tomorrow

Personality drives leadership, and leadership drives organizational effectiveness; ultimately, then, personality determines the fate of organizations. Writers as varied as Max Weber, Sigmund Freud, and Peter Drucker have made the same point. Modern research in personality assessment makes it possible to quantify this truth, an empirical fact with huge practical consequences.


Advances in the Science of Assessment

  • Fritz Drasgow (University of Illinois) - Fake Resistant and Efficient Personality Assessment

Interest in temperament/personality as a predictor of performance has been galvanized by a rapidly growing body of empirical evidence that shows temperament constructs predict performance across a diverse array of civilian and military occupations. The Tailored Adaptive Personality Assessment System (TAPAS), and its derivative cousin the WorkFORCE® Assessments, represent a new generation of temperament measures that 1) are fake-resistant, 2) utilize computer adaptive technology to measure well across a broad range of trait continua, and 3) are easily customized to meet the needs of many civilian and military organizations. These instruments use a two-alternative forced choice format, with pairs of statements selected from different facets of personality. The statements are constrained to be similar in extremity and social desirability, which reduces faking. Responses are scored with the Multidimensional Pairwise Preference (MDPP) item response theory model, which has been shown to eliminate the problem of ipsativity. Statements are paired adaptively to maximize psychometric information; simulation studies have found that a 50% reduction in test length is possible with no loss of reliability. Facets can be selected from a comprehensive set of 22 facets that underlie the Big Five personality dimensions. Validity studies have found that TAPAS predicts attrition and “will do” aspects of performance.
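As a rough illustration of the matched-desirability pairing logic described above (a hypothetical statement pool and threshold, not the actual TAPAS item-selection algorithm), the idea can be sketched as follows: statements from different facets are paired only when their social-desirability ratings are close, so neither option looks obviously "better" to endorse.

```python
import itertools

# Hypothetical statement pool: (facet, statement text, social desirability on a 1-7 scale)
statements = [
    ("achievement", "I set demanding goals for myself.", 5.8),
    ("sociability", "I enjoy meeting new people.", 5.6),
    ("order", "I keep my workspace tidy.", 5.5),
    ("even_temper", "I stay calm under pressure.", 5.9),
]

def forced_choice_pairs(pool, max_desirability_gap=0.5):
    """Pair statements from *different* facets whose social-desirability
    ratings are within max_desirability_gap of each other, reducing the
    incentive (and ability) to fake by picking the 'nicer' option."""
    pairs = []
    for a, b in itertools.combinations(pool, 2):
        different_facets = a[0] != b[0]
        similar_desirability = abs(a[2] - b[2]) <= max_desirability_gap
        if different_facets and similar_desirability:
            pairs.append((a[1], b[1]))
    return pairs

pairs = forced_choice_pairs(statements)
```

In an operational adaptive system, the pairing would additionally be driven by item response theory (the MDPP model mentioned above) to maximize psychometric information given the examinee's provisional trait estimates; the sketch shows only the desirability-matching constraint.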

  • Adam Meade (North Carolina State University) – Rapid Response Measurement: Reliable, Faking-Resistant Measurement in Less Than 20 Seconds

Rapid Response Measurement is a method in which stimuli (e.g., personality adjectives) are presented on screen one at a time and are responded to with a dichotomous response. Reliable measurement takes around 10-20 seconds per construct. Validity and faking resistance evidence will be presented, including correlations with job performance.

  • Scott Tonidandel (UNC-Charlotte) – Deriving meaning from unstructured text: Assessing leader challenges using structural topic models

Over 8,000 participants in a leadership development program reported on the most significant challenges they face as a leader or manager. Using that unstructured text data, this session will demonstrate how structural topic models can be applied to assess the challenges leaders and managers face and explore how those challenges covary with leader and manager demographics.


Neal Schmitt (Michigan State University) - Reflections on Advances in the Science of Assessment  

Dr. Schmitt will reflect on the scientific findings regarding the efficacy of the new measurement approaches and facilitate discussion on key considerations required when implementing these new approaches within organizational contexts and assessment programs. The purpose of this session is to draw connections between the presentations and demonstrate how talent management professionals and HR leaders can apply these learnings to advance the quality of talent assessment decisions in their organizations.


High Impact Practitioner Talk

  • Tatana M. Olson (United States Navy) - Finding the “Right Stuff”: How Advances in Neuroscience and Biotechnology Enhance Selection for Extreme Work

What does it mean to have the “right stuff?” High-risk environments, including those experienced in military operations and law enforcement, expose personnel to extremely demanding physical and psychological stressors. This presentation will discuss emerging efforts within these domains to enhance more traditional personnel selection methods leveraging advances in neuroscience and biotechnology.

  • Christina Norris-Watts (Johnson & Johnson) - Demystifying Gamified Assessment: Real-World Strategies and Lessons Learned  

Recently, we’ve seen significant innovation in pre-hire assessments. These new assessments are both exciting and daunting. In this presentation I will share real-world learnings on how to explain gamified assessments to business audiences. I will also provide practical strategies for I/Os looking to implement, or learn about, gamified selection assessments.


Break (sponsored by EchoSpan)


Speed Benchmarking (Emily Solberg, Jerilyn Hayward, Charles Handler)


Can Assessment Help Us Grow? Linking Assessment to Development (Part 1)

  • Jessica Parisi (BTS) - High-Fidelity Assessments, Top Performers and Transformers 

We asked our client to explain their #1 ranking in shareholder value creation across the S&P 500 over 30+ years. Their answer: “We have the best leaders in the industry. Period.” Top leadership drives growth. Jessica will share the ingredients of assessments, combined with tightly aligned support, that impact company growth. She will address the ingredients for successful simulation development aligned to people’s evolving values and ways of consuming software, including authorship, practicality at scale, the use of “great” as a standard, and reflecting today’s way of working. Jessica will also share specific assessment tools and development journeys and discuss the individual and organizational outcomes achieved.

  • Erin Laxson (Hogan) - Using Personality Assessments to Guide Development of Effective Leaders  

An organization’s success is based on the effectiveness of its leaders. Organizations invest in leadership development activities, which often include personality assessment. In this session we will discuss how personality assessment can help organizations determine who to invest in and the linkage between personality and the indicators of successful development.




Can Assessment Help Us Grow?  (Part 2)

  • Sarah Stawiski (CCL) - Bringing on the Proof of Assessment-Intensive Development Programs

Feedback via formal assessments and other means is a critical element of leader development. This session will focus on how feedback-intensive development programs are designed and implemented to help leaders grow and provide findings about their effectiveness from more than a decade of measuring results.

  • Evan Sinar (Better Up) - Coaching-Centric Assessments: Deep Data to Fuel Guided Growth 

By integrating assessment data with coaching relationships, organizations can draw on the complementary advantages of each to drive individual growth and organizational value. Assessment-infused coaching uniquely blends standardization with personalization, and operational efficiency with participant engagement. This presentation overviews principles and evidence-based recommendations for development-centric assessments applied to coaching practices.     


Facilitated Discussion on Assessment for Development

  • Jose David (Merck) - Facilitator

Dr. David will draw together the key themes and insights across the presenters in this section and facilitate a discussion among the presenters on how assessment tools and the data they generate can be used to drive personal and organizational growth.  He will also reflect upon strategies to increase the value of developmental assessments for organizations when making talent investment decisions.  The purpose of this talk is to draw connections between the presentations and demonstrate how talent management professionals and HR leaders can apply these learnings to advance the quality of talent development decisions in their organizations.


High Impact Practitioner Talk

  • Matthew Dreyer (Prudential Financial) - The Full (Assessment) Monty: Accelerating Individual and Organizational Development

Business simulations foster development by creating rich scenarios to develop new thinking and behaviors. They also provide a unique opportunity to assess participants. Learn how Prudential is leveraging simulation in conjunction with traditional assessment, observation, feedback, and individual and pod coaching to accelerate readiness of participants for higher level roles.


Paul Sackett (University of Minnesota) - Putting the Pieces Together: Reflections on the Next Chapter of Assessment Progress

For this final presentation, Dr. Sackett assumes a discussant role, integrating ideas from the various presentations to offer a perspective on the future of assessment. He will prepare in advance a series of propositions regarding the future of assessment, and then discuss how these propositions mesh with or are challenged by ideas presented at the conference.


Closing Summary / Departure - Debrief with Audience

Preconference Workshops – October 24th

You may register for one or two. AM Session: 8:30am-noon; PM Session: 1:30-5:00pm




AI x AI: Let's Talk Assessment Innovation and Artificial Intelligence

Technology advances, including artificial intelligence and gaming, are creating new possibilities for more engaging, efficient, and predictive assessments. This workshop will focus on innovations for screening, cognitive ability, and soft skill assessments as well as applications for interviews. The workshop will go beyond the demos and user experiences to unpack the science and analytics underlying these next-generation assessments to help attendees make informed decisions.

Robert Gibby (IBM), Faisal Ahmed (Knockri), Matthew Neale (Revelian)

What Do We Know, Think We Know, and Know We Don't Know? Insights from the Latest Multidisciplinary Research on Assessment

This workshop will highlight research from multiple disciplines relevant to the assessment space. The workshop will provide a brief update on recent research for "Old School Assessment" tools and approaches, as well as dive into the latest research on top assessment trends, such as on-demand video interviews, multimedia simulations, modularization, gamification, and mobile, and discuss the impact of research in other disciplines. Besides summarizing what is new, the workshop will identify key gaps in our knowledge and discuss how multidisciplinary research can help broaden our thinking about the future of selection.

Ann Marie Ryan (Michigan State University), Anthony Boyce (Amazon)

Validation Meets Innovation: Doing Selection Right in the 2020s!

This workshop will explore issues related to job analysis, test development, criterion development, and validation as they apply to new assessment tools (e.g., selection procedures based on artificial intelligence, facial recognition, performance in games) that are emerging in our field. The content of this workshop is designed to refresh practitioners’ skills, expand their knowledge base, and highlight areas in which best practices have not yet been defined, and will be grounded in existing legal and professional guidelines.

Nancy Tippins (The Nancy T. Tippins Group), Fred Oswald (Rice University), Mort McPhail 

Please note: Lunch is not provided with workshop registration. However, a few informal groups are organizing, and workshop participants are encouraged to join in the fun.