
2005-2006 Program Committee Goals: Progress Report

Submitted by: Julie B. Olson-Buchanan

December 20th, 2005

Committee Members:

SIOP 2006 Strategic Program Planning Subcommittee


Invited Sessions: Robin Cohen (Chair), Boris Baltes, Erika D'Egidio, Keith Hattrup

Interactive Posters/Community of Interests/CE Credit (IP/COI/CE): Joselito (Joe) Lualhati (Chair), Herman Aguinis, Matthew Barney, Deborah (Debbie) Ford, Deborah Rupp, Paul Squires

Proposals/Flanagan: Michael (Mike) Horvath (Chair), Michelle Duffy, Lisa Finkelstein (Past Program Chair), Alan Mead

Reviewer Recruiting: Daniel (Dan) Beal (Chair), Margaret Beier, Chris Lovato, Susan Straus

Sunday Seminars: Tammy Allen (Chair & Program-Chair-in-Training), Melissa Gruys, Charles Handler, Jennifer Kaufman, Rudolph (Rudy) Sanchez

Sunday Theme: Wendy Boswell (Chair), Carly Bruck, Steve Hunt, Stephanie Payne

1. Explore ways to improve SIOP formats and clarify distinctions between formats.

a. Examine issues/questions that have previously been raised about differences between formats/sessions.

This is complete. The Call for Proposals/Flanagan committee (Mike Horvath (chair), Michelle Duffy, Lisa Finkelstein, and Alan Mead) gathered the questions raised in previous years and identified areas that needed further clarification. The Call was also examined for redundancy and ambiguity. In particular, we determined that it would be helpful to change the terminology used in the Call to distinguish between Proposal Format (the format required to submit the proposal) and Session Type (the nature of the session as it would be presented at SIOP). In addition, we changed the name of the Practitioner Forum to Practice Forum to better reflect the content of the session rather than a likely presenter.

b. If needed, make adjustments in the Call for Proposals.

This is complete. The Call for Proposals document was modified substantially: additional details were provided, and several sections were revised or consolidated to eliminate redundancies. The most significant change is the new terminology used to describe the three Proposal Formats and eleven Session Types.

c. Examine the success of the new 2005 formats: Academic-Practitioner Collaborations and Theoretical Advancements.

This is complete. The Call for Proposals/Flanagan Award committee surveyed the presenters who had pioneered these two new Session Types in 2005. Their feedback was used to modify the descriptions in the Call and the Proposal Format requirements. In particular, based on the feedback we received, we decided to allow submitters to choose between two Proposal Formats (Multiple Presenter or Open Format) when preparing submissions.

d. Evaluate the success of adjustments made to the Community of Interest sessions and Interactive Poster Sessions.

This is complete.  

The Community of Interest (COI) sessions were evaluated in two ways. First, Joselito Lualhati (2006 chair of Interactive Posters/Community of Interests/CE Credit) and Mike Horvath (2005 chair of the same committee) informally polled several 2005 SIOP COI attendees about the success of these sessions. Second, attendance figures were gathered from the records kept by student volunteers who were present at all sessions. The general conclusion of the committee (Joselito Lualhati (chair), Herman Aguinis, Matthew Barney, Deborah Ford, Deborah Rupp, and Paul Squires) is that the COI sessions are not particularly effective at this point.

The subcommittee is addressing this issue by experimenting with ways to make the sessions more effective. The first experimental approach is to have a known host for some of the sessions; the committee has selected several topics and invited well-known hosts in each area to serve as facilitators. The second experimental approach is to use an online suggestion system a few weeks before SIOP to generate the topics for the remaining sessions. This should allow us to capture the newest topics that conference attendees would like to discuss with others, topics that might not otherwise appear in the program. Another change being implemented this year is that all COI sessions will be held in a separate room instead of an open area (the open area created confusion about who was there to participate versus to use an open chair). In addition, we will provide some light refreshments (not advertised, but sponsored) for participants.

Last year, a formal survey of the effectiveness of the Interactive Poster sessions was conducted, and a number of changes were made to the process by which the sessions are run. Informal feedback received at the 2005 SIOP was enthusiastic. We also polled the 2005 facilitators for their recommendations; they were similarly enthusiastic but suggested that facilitators' names be included in the program if possible. As a result of their feedback, this year we will include the facilitators' names in the program to provide both recognition and accountability.

e. Explore ways to provide financial support for external invited speakers.

Last year there was $3,000 in the budget for an external speaker; an additional $3,000 was proposed and accepted for this year's budget. The $6,000 will be used to support invited speakers for both the invited sessions in the regular program and the Sunday Theme sessions. The funding allowed us to invite several outside speakers to create three invited sessions that are interdisciplinary in nature. The first session focuses on coaching and includes several outside (partially funded) speakers who come from different backgrounds (e.g., law, other areas of psychology). The second session is on ethics and includes both the current and former APA Ethics Directors (both partially funded). The third session is the invited session for the Sunday Theme on Crossing Disciplinary Borders; it focuses on scholars outside of I/O psychology whose research crosses into the area of I/O. Financial support will be used to partially fund two of the speakers, from engineering and educational psychology.

f. Create guidelines for identifying which sessions should be in the program outside of the review process.

This was discussed briefly at the Conference Planning Meeting in Dallas in June; specifically, we discussed which sessions would be in the 2006 program outside of the regular review process. We are still discussing how to handle this issue and will continue the discussion at the February meeting. We anticipate having guidelines in place in the last quarter.

2. Continue to improve the submission process.

a. Review the website and make suggestions for improving instructions as needed (see items 1a and 1b above).

This is complete. The online submission website was modified to accommodate the changes in the Call for Proposals. In particular, it was reprogrammed to reflect the new terminology and to allow submitters to select a Proposal Format for the Session Types that allow a choice.

b. Examine the content areas and make changes as needed.

This is complete. Changes were made based on observations from last year's reviewer assignment process (e.g., submissions that were originally placed in the Other category). The content areas of Group Processes/Dynamics and Work Groups/Teams were combined into a new category, Work Groups/Teams and Group Processes. Three new content areas were added: Emotions at Work (19 submissions), Item Response Theory (5 submissions), and Counterproductive Behavior and Workplace Deviance (26 submissions). The number of IRT submissions was lower than we anticipated, but the separate category was particularly helpful for assigning appropriate reviewers.

3. Continue to improve the reviewing process.

a. Explore ways to lighten the load for reviewers.

This is complete. First, we scheduled the reviewer assignment and program scheduling sessions further apart so that reviewers would have an additional week to review submissions (four weeks instead of three). Second, we created a new reviewer recruiting letter designed to increase the number of reviewers. Third, we expanded the use of Student Affiliate reviewers (see items c and d below). As a result of these efforts, we recruited 1,097 reviewers, several hundred more than in previous years. Most reviewers received only 3-4 papers to review, and no reviewer received more than 5. We believe this effort was very successful in reducing the load on reviewers and increasing the likelihood that they will be willing to review for us in the future.

b. Make efforts to recruit senior, expert reviewers.

We recruited 167 expert reviewers this year. Unfortunately, the comparable number from last year is unavailable, so it is not clear whether this represents a change. We did identify a potential problem with the expert classification system: several people we would consider expert reviewers were not identified as such by the system. I will work with next year's program chair (Tammy Allen) to examine the criteria before next year's reviewing process.

c. Explore criteria for including graduate students as reviewers.

This is complete. Last year, Lisa Finkelstein pilot-tested the use of a select group of Student Affiliate members as reviewers. She used three eligibility criteria: (1) be ABD, (2) have presented at SIOP as a first author, and (3) be recommended by faculty. This year, Dan Beal examined the ratings from the pilot study and found that the Student Affiliate reviewer ratings are very similar to the Member ratings. Specifically, none of the means are significantly different from each other (nor is the multivariate difference significant); separate factor analyses (using items as a scale) provide almost identical unidimensional results (the factor loadings are very close and fall in the exact same order); and the reliabilities for both groups are similarly high, with the exact same pattern of correlations between the facet items and the overall item. Based on these results, we decided to expand the use of Student Affiliate reviewers this year, retaining two of the original three criteria: (1) be ABD (defined as having defended the dissertation proposal) and (2) have previously presented at SIOP as a first author.

d. Examine ways to incorporate graduate student reviewers in the automatic reviewer assignment process.

This is complete. The reviewer sign-up site was modified for Student Affiliate use; specifically, the program requires Student Affiliates to confirm they have met both criteria before proceeding to the sign-up process. A recruiting letter was sent to Student Affiliates on August 16th. We recruited 86 student reviewers.

e. Explore ways to recruit reviewers in content areas that are under-represented.

Dan Beal conducted a separate analysis of the 2005 submissions by content area relative to the reviewer expertise ratings. We identified Diversity, Organizational Justice, Work and Family, Work Groups/Teams, and Occupational Health, Safety, and Response as areas under-represented by reviewers. We incorporated this information into the reviewer recruiting email in the hope that individuals with expertise in those areas would respond. This approach appears to have been successful, as evident from the reviewers' reported areas of expertise (see below). Please note that an "A" level of expertise indicates the reviewer believes he/she is very knowledgeable about the content area, and a "B" level indicates the reviewer is somewhat knowledgeable.

Diversity: 140 A level expert reviewers & 293 B level reviewers

Organizational Justice: 90 A level expert reviewers & 159 B level reviewers

Work & Family: 116 A level expert reviewers & 264 B level reviewers

Work Groups/Teams: 173 A level expert reviewers & 306 B level reviewers

Occupational Health: 62 A level expert reviewers & 149 B level reviewers

f. If needed, update reviewer guidelines.

This is complete. I revised the reviewer guidelines, and both the Call for Proposals committee and the Reviewer Recruiting committee examined them and provided feedback.

g. Explore ways to minimize manual assignments in the Other category.

We modified the reviewer assignment program to initially assign reviewers to Other-category submissions based on the secondary content area. This was a considerable help at the reviewer assignment meeting: although we still manually checked the assignments for the Other category, most of the initial assignments were appropriate and required fewer changes than in previous years.

h. Examine ways to prevent reviewer assignment emails from being treated as junk mail.

This is complete. Part of the problem is that spam filters flag a message containing several links (such as our reviewer assignment emails) as spam. We sent the reviewer assignment email as we have previously, but we also sent a second email (without links) indicating that the first had been sent. This considerably cut down on the number of people who were unaware of their reviewer assignments.

4. Explore ways to further improve the scheduling process.

a. Make suggestions for modifying the scheduling program to minimize content overlap.

This is complete. Based on last year's programming meeting, I forwarded several possible solutions to Larry Nader and Milt Hakl. They examined the alternatives and decided the best approach would be to create a report identifying the titles and content areas scheduled for a given time slot. This allowed me to check for glaring content conflicts. Although some conflicts are unavoidable, there were a few clear content conflicts that I could resolve before sending out the acceptance letters.

b. Examine whether serving as Chair of a session should count toward the submission limit, to minimize scheduling conflicts.

This is complete. The advantages and disadvantages were discussed at the Conference Planning meeting. Given the scheduling constraints, we decided it would be best to eliminate the Chair exemption. The Call for Proposals was modified and the change was highlighted.