Master Collaboration: Technology and Assessment: Research Gaps, Best Practices, and Future Agenda

 

Friday, May 16, 2014
9:00-10:30am
Room 304 B

Chair: Hailey Herleman
Committee: David Mayfield, Lauren Blackwell, Nathan Bowling, and Gary Giumetti

The Master Collaboration brings together a range of leading practitioners and academics focused on technology and assessment to share the state of the science and practice, to identify gaps, and to outline opportunities for future collaboration. This session will appeal to practitioners and academics looking to bridge the gap between good science and the frontiers of technological implementation.

Continuing Education: Participants who attend the Master Collaboration session are eligible for 1.5 continuing education credits. This session is intended for a general audience with a graduate-level foundation in psychology; prior coursework or experience in selection and assessment would be helpful, but no specific prior knowledge is required. There is no additional cost to attend beyond the cost of basic conference registration. SIOP is approved by the American Psychological Association to sponsor continuing education for psychologists. SIOP maintains responsibility for this program and its content. Additionally, SIOP is an HR Certification Institute Approved Provider.

Abstract: From assessment centers to situational judgment tests (SJTs) to serious games, technological advances are evident across the field of assessment. This session brings together leading practitioners and academics focused on technology and assessment to share the state of the science and practice, to identify gaps, and to outline opportunities for future collaboration.

Learning Objectives:

  1. Identify gaps between research and practice in assessment technology, especially simulations.
  2. Recognize academic and applied concerns related to the use of these techniques.
  3. Describe practical issues related to implementation and strategic business planning of assessment technology.
  4. Discuss a future agenda for research avenues related to technology-based simulations for selection.

Presentations and Speaker Biographies

Assessment & Technology: Till Death Do Us Part
Scott Bryant, Development Dimensions International (DDI)

The speaker will survey the many ways technology is being used to enhance assessments, with special attention to simulations. Benefits and pitfalls of relying on technology, as well as applied and future research avenues, will be discussed.

Scott Bryant is Manager of Consulting Services for Development Dimensions International (DDI) in Atlanta, GA. Scott leads a team of consultants who support clients in the talent management space, including talent acquisition, talent development, and talent deployment. His consulting experience spans a wide range of industries, including manufacturing, financial services, technology, media, healthcare, and pharmaceuticals. His 18 years of applied experience includes both internal and external roles involving job analyses, competency modeling, assessment design and implementation, leadership development, executive assessment, and succession management. He also serves as an executive coach, working with individual leaders to support their development and growth plans. He has published research related to job analysis, psychological assessment, and leadership, and has spoken at numerous national human resources conferences. Scott holds a Ph.D. in Industrial and Organizational Psychology from the University of Southern Mississippi.

Researching Technology and Assessment: Then, Now, What’s Coming Next
Mark Frame, Middle Tennessee State University

Technology-Enabled Assessment Center (TEAC) methods have changed the way assessment processes are developed and administered. Unfortunately, little research is publicly available to help guide best practices. Dr. Frame will discuss research on candidate perceptions of TEAC methods and on differences between TEAC and traditional AC methods and assessors, using his work with Fenestra, Inc. as an example of how such research could be conducted.

Mark Frame is an Associate Professor of Psychology at Middle Tennessee State University (MTSU). He was the 2004 recipient of the prestigious American Society of Training and Development Dissertation of the Year Award. Dr. Frame graduated with a B.A. from the University of New Orleans and earned both his M.S. and Ph.D. in the Industrial and Organizational Psychology Program at the Illinois Institute of Technology (IIT), where his mentors included Roya Ayman, Ph.D., and the late Nambury S. Raju, Ph.D. His background includes multiple assessment-related activities. While completing his education he worked for Personnel Decisions International (PDI), Personnel Research Associates (PRA), and the New Orleans Civil Service Test and Validation Unit. As an independent consultant, he has worked with Microsoft People Research, Organizational Strategies Incorporated (OSI), Ameritech Human Resources Research and Process Development, the Institute for Business and Technical Careers at Richard J. Daley College, the Manufacturing and Productivity Center at IIT, PDI, and PRA. He has also served as an assessor for Saville Holdsworth Limited (SHL), MICA Management Resources, PricewaterhouseCoopers (PwC), IIT's Center for Research and Service, Leadership Worth Following, L.L.C. (LWF), and the MTSU Center for Organizational and Human Resource Effectiveness (COHRE). Dr. Frame has also served as an independent expert for the City of Dallas Attorney's Office in a matter relating to challenges made to the City's Civil Service assessment center-based promotional process. He has been awarded fixed-price research contracts with International Human Performance Corporation, LWF, and the JCPenney Company, Inc. Presently, he leads a research lab investigating individual assessment and assessment centers, conducting laboratory-based assessment research as well as collaborative applied research with Fenestra, Inc. Dr. Frame has published in the Journal of Applied Social Psychology and the European Journal of Social Psychology and has contributed to the texts Employee Assistance Programs: Wellness/Enhancement Programming, the Encyclopedia of Industrial and Organizational Psychology, and the Encyclopedia of Leadership.

Advancements in Assessment Technology: Bringing Better Experiences to Candidates
Ben Hawkes, Kenexa, an IBM Company

A review of existing research will demonstrate that simulation-based assessments can offer organizations increased validity, greater differentiation of their recruitment processes, and a more positive candidate experience. In addition, the speaker will discuss how technological advances have lowered the development cost of simulations while giving more candidates access to capable PCs, mobile devices, and broadband connectivity.

Ben Hawkes is the Global Director of Assessment Research and Development at Kenexa, an IBM Company. After graduating with a BSc (Hons) from the University of Surrey and the University of North Texas, he has worked in the field of organizational psychology in the US and Europe for over fifteen years. He specializes in the use of assessment for selection and development, ranging from unproctored internet testing to interviewing and assessment centres. His research interests include the deployment of assessments to mobile devices, the use of video game techniques within assessment, and test-takers’ perceptions of animated characters, or ‘avatars’, in simulation-based assessments. He has presented at the Society for Industrial and Organizational Psychology, the Association of Test Publishers, and the UK’s Association for Business Psychologists. He also delivers courses for the UK’s Chartered Institute of Personnel and Development in the area of human resource strategy.

Assessment Centers in the Future: Can Research Keep Up?
Duncan Jackson, University of East London

As work becomes increasingly global, and in the wake of the global economic crisis, new considerations arise in developing ACs and bringing them up to date. Has research kept up with these rapid changes? What are the implications of working with diverse cultural groups in ACs, and of delivering ACs on electronic devices?

Duncan Jackson has worked as an academic at Massey University (New Zealand) and the University of Seoul (South Korea), and as a consultant for Hudson/TMP, ASSESTA, and a&dc. He has published widely on the topic of assessment centers and served as chief editor of a recent volume, The Psychology of Assessment Centers (2012, Routledge). He is scheduled to be a keynote speaker at the upcoming ACSG meeting in South Africa. Duncan holds a Ph.D. in Industrial and Organizational Psychology from Massey University and currently works as a Senior Lecturer at the Royal Docks Business School at the University of East London.

Discussion
Nancy Tippins, CEB Valtera

The discussant will close the session by walking attendees through areas of concern in technology and assessment, including validity and reliability, realism, legal and professional standards, and applicant reactions. Finally, she will discuss a research agenda for the future.

Nancy T. Tippins is a Senior Vice President of CEB Valtera Corporation. Nancy brings more than 30 years of experience to the company, where she manages the firm's development and execution of strategies related to job analysis, competency development, employee selection, assessment, and leadership development. Nancy oversees the teams that develop legally and professionally compliant test and assessment tools, administrative processes, and delivery platforms to meet client staffing, development, and succession planning requirements. Nancy also conducts executive assessments and provides expert support in litigation. Nancy is active in professional affairs. She has a longstanding involvement with the Society for Industrial and Organizational Psychology (SIOP), where she served as President (2000-2001). In addition, she served on the Ad Hoc Committee on the Revision of the Principles for the Validation and Use of Personnel Selection Procedures and was one of the U.S. representatives on the ISO 9000 committee to establish international testing standards. She currently serves on the Joint Committee to revise the Standards for Educational and Psychological Tests. She is a fellow of SIOP, Division 5, the American Psychological Association (APA), and the American Psychological Society (APS), and is involved in several private industry research groups. Nancy has authored numerous articles on tests and assessments, including an edited volume on technology-enhanced assessments, Technology-Enhanced Assessment of Talent. She has served as the Associate Editor for the Scientist-Practitioner Forum of Personnel Psychology. She is currently on the editorial boards of the Journal of Applied Psychology, Personnel Psychology, and Industrial and Organizational Psychology: Perspectives on Science and Practice, and is the incoming editor of SIOP’s Professional Practice Series. Nancy received her B.A. in history from Agnes Scott College, an M.Ed. in Counseling and Psychological Services from Georgia State University, and an M.S. and a Ph.D. in Industrial and Organizational Psychology from the Georgia Institute of Technology.
