
Workshop 11 (half day)

Exploring New Frontiers in Test Security: Approaches for Protecting Your Testing Program

Presenters: Monica A. Hemingway, Starwood Hotels & Resorts Worldwide, Inc.
            Eugene Burke, SHL
            Dennis Maynes, Caveon Test Security
Coordinator: Liberty J. Munson, Microsoft Corporation

Test security is becoming increasingly important to organizations and assessment providers as assessments are used more widely across the employee lifecycle (e.g., recruitment, selection, training, development, and succession planning) and Internet-based testing becomes more common. Protecting the integrity of a testing program is paramount to ensuring that sound employment decisions can be made. The good news is that I-O psychologists do not need to start from scratch: we can learn from the experience and technologies developed in educational and accreditation testing. 

This workshop will provide a hands-on approach to understanding the importance of test security; the policies and procedures that mitigate threats to a testing program; how to leverage the test security techniques used by other testing bodies; and how statistical analysis can promote fairness and improve testing program integrity.

A case study will be used to lead participants through hands-on exercises. Workshop leaders will provide sample test security policies and procedures, demonstrate the use of data forensic tools, and present a comprehensive approach to test security that can successfully protect your testing program and mitigate testing anomalies.

At the end of this workshop, participants will be able to:
• Describe threats to the integrity of testing programs and their impact on test takers, constituents, and the general public
• Explain policies, processes, and procedures for protecting testing programs
• Use statistical analysis techniques for detecting test irregularities and explain how, in general terms, these analyses work
• Describe how statistical analysis techniques have been successfully used in mitigating test theft and fraud
• Explain the legal defensibility of using statistical analysis techniques to invalidate scores and results
• Describe how statistical analysis techniques are used within a test security framework
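To make the statistical-detection objectives above concrete, here is a minimal toy sketch (not drawn from the workshop materials) of one classic data-forensics index: counting, for each pair of examinees, how many items they answered incorrectly with the same wrong option. Unusually high counts can flag possible answer copying for further investigation. All examinee names, responses, and the answer key below are hypothetical.

```python
from itertools import combinations

def matching_wrong_answers(responses, key):
    """For each pair of examinees, count items where both gave the
    SAME incorrect option -- a simple collusion-screening statistic."""
    flags = {}
    for (a, ra), (b, rb) in combinations(responses.items(), 2):
        shared = sum(
            1
            for x, y, k in zip(ra, rb, key)
            if x == y and x != k  # identical response, and it is wrong
        )
        flags[(a, b)] = shared
    return flags

# Toy data: a 5-item answer key and three hypothetical examinees
key = ["A", "C", "B", "D", "A"]
responses = {
    "p1": ["A", "C", "B", "D", "A"],  # all correct
    "p2": ["B", "D", "B", "C", "A"],  # three items wrong
    "p3": ["B", "D", "B", "C", "A"],  # same three wrong answers as p2
}

pairs = matching_wrong_answers(responses, key)
print(pairs[("p2", "p3")])  # → 3 identical wrong answers
```

In practice, a raw count like this would be compared against the distribution expected by chance (given each examinee's ability and the item options) before any pair is flagged; the statistic alone is a screening signal, not proof of misconduct.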

Monica A. Hemingway is senior director of Selection and Assessment at Starwood Hotels & Resorts Worldwide, Inc., where she leads the development and implementation of assessment programs throughout the organization.  Before joining Starwood, Monica held leadership positions at Valtera and Dow Chemical, focusing primarily on global selection test development and implementation, and at The Chauncey Group International, where she was responsible for research, development, and statistics for the Test of English for International Communication (TOEIC), one of the world’s largest certification programs with over 1.5 million test takers per year.  Her focus has been largely in the areas of test development, validation, and implementation, particularly in high-stakes (e.g., certification, selection/promotion) and global settings, and she has developed systems and tools for the evaluation and selection of executive, professional, administrative, and technical employees in a wide variety of countries.  She has extensive experience in maintaining the integrity of testing programs, including developing, applying, and training others in ways to prevent, detect, and handle breaches to test security.  Monica received her PhD in I-O psychology from Bowling Green State University. 

Eugene Burke is director of science and innovation at SHL, responsible for product design, psychometric technologies, and the development of cheat-resistant, multilanguage solutions for online ability testing. As part of his responsibilities, and working with partners such as Caveon, Eugene has developed test security processes for monitoring and responding to potential security breaches worldwide, data forensic audits to monitor test-item exposure and the integrity of online testing programs, and verification procedures for validating unproctored test scores. Eugene's career spans 28 years, beginning as a research scientist for the UK Ministry of Defence, including a tour with the United States Air Force, where he developed computer-based measures of information processing and attention for aircrew selection. He has led applied units for a variety of organizations, including the London Fire Brigade, and has consulted with both private- and public-sector clients in the UK, U.S., Europe, and Asia. He is a past chair of the British Psychological Society's Steering Committee on Test Standards, past chair of the Division of Occupational Psychology, past council member of the International Test Commission, and currently secretary to the European Association of Test Publishers. He has published several articles and book chapters on psychometrics, computer and Internet testing, personnel selection, coaching and development, test security, and test localization and adaptation. He is a regular presenter at professional conferences, including the Association of Test Publishers (U.S.), European Association of Work and Organizational Psychology (Europe), International Test Commission (ITC), BPS Occupational Psychology Conference (UK), Military Testing Association (U.S. and Europe), and various aviation psychology conferences in Europe and Asia. 
Last year, he presented at SIOP on developing cheat-resistant solutions for unproctored Internet testing, a presentation currently under consideration as an article for the SIOP journal.

Dennis Maynes has a master's degree in statistics from Brigham Young University and extensive, varied experience in research and development. His background includes R&D in computerized testing, including systems for generating and scoring computer-administered tests from criterion-referenced item banks and systems for administering computerized adaptive tests. He also has expertise in linear and nonlinear modeling using regression, neural networks, and sequential models. His work experience includes over 10 years developing computer-based systems for delivering curriculum and online tests at Wicat Systems and the Wicat Education Institute; 3 years deploying systems for managing rack-based server equipment at Intel; and 3 years developing mathematical models and algorithms for speech recognition research at Fonix. His current interests center on developing and using testing models to detect change and aberrant response patterns, and he is actively pursuing applied research in optimal sequential model selection for pattern recognition.

