
About Your Feedback on the SIOP Conference

Julie Olson-Buchanan
California State University, Fresno

Eric D. Heggestad
University of North Carolina-Charlotte

Reflecting on 25 years of SIOP conferences, it is clear that the conference has grown and changed substantially since the first conference in 1986. For example, in just the past few years, the SIOP conference has undergone a number of significant changes, such as the shift from a 2½-day format to a 3-day format and the addition of theme tracks within the conference program. Accordingly, we have increased our emphasis on conference evaluation.

Although there have been a number of well-crafted conference surveys over the years, evaluation wasn’t a formalized part of the conference planning process until 2008, when Eric became the chair of the newly formed Conference Evaluation Committee. It took us some time to pull together a conference evaluation questionnaire, and, as some of you may recall, the 2008 Postconference Survey was conducted in August of 2008, many months after that year’s conference. In 2009, we were ready to roll, and the survey was initiated the week after the SIOP conference in New Orleans.

The fundamental goal in conducting these evaluations is to assess your satisfaction with the conference, including anything from the content of the conference program to the conference facilities to the location. By building a comprehensive database of this information collected each year, we will be able to see how the growth, development, and changes to the conference impact attendee satisfaction. In addition to simply looking backwards (i.e., how satisfied were you with the conference you just attended), we have also attempted to be forward looking. That is, we have asked for your thoughts and ideas about future SIOP conferences and issues that might improve the overall experience of attending the conference.

In this article, we would like to do a couple of things. First, we want to provide you with some information about your general level of satisfaction with the conference. Second, we would like to focus on some of the things that we have learned through the postconference survey, highlighting how these data have shaped decisions about the conference. Third, we would like to address some things that seem, at least from our perspective, to be persistent questions about the conference. Here, we would like to address why we do things the way we do.

Before we get into the issues, however, some very special thank yous are necessary. Questar has been conducting the survey, and our contact person there over the last year, Jessica Stransky, has been very helpful (and responsive to Eric’s requests for more and different information). In addition, the efforts of Dan Beal and Lynn McFarland, who serve on the Conference Evaluation Committee, are very much appreciated as is the input from the Conference Planning Committee.

Satisfaction With the Conference

For the 2009 Postconference Survey we had 949 respondents, a response rate of 28%. Three hundred and eighty-nine respondents (41%) were Student Affiliates. Of the 560 non-student respondents, 306 individuals indicated that they would describe themselves as primarily a practitioner, and 239 indicated that they would describe themselves as primarily an academic (some individuals did not respond to this item).

Conference Satisfaction. Without a doubt, we can say that conference goers are generally quite satisfied with the conference. Ninety percent of conference attendees indicated that they agreed or strongly agreed with the item “Overall, I am satisfied with the conference.” The results differed very little between those non-student attendees who indicated they were primarily practitioners and those who indicated that they were primarily academics. Specifically, 87% of practitioners and 92% of academics indicated that they agreed or strongly agreed with this item.

Three-Day Conference Format. One issue that we have been particularly focused on is people’s reactions to the change to a 3-day conference format. It would seem that attendees are positive about the change. Only 8% of respondents in both 2008 and 2009 indicated that they disagreed or strongly disagreed that “Changing to a full 3-day format was a positive change for the SIOP conference.” Seventy-five percent and 68% of all respondents provided responses of strongly agree or agree to this item in 2008 and 2009, respectively. There have been no differences in the responses to this item between those identifying themselves as academics and those identifying themselves as practitioners.

Invited Presentations. 
Recently, SIOP began including more invited presentations on the program. The idea behind adding these sessions was to bring in external, well-known scholars or people influential in the world of business to share their thoughts with us. To get a sense of your comfort with the inclusion of these talks in the conference program, we asked whether having invited presentations by people who were not I-O psychologists should continue to be part of the conference program. Seventy-four percent of respondents in 2009 indicated that they agreed or strongly agreed that these presentations should continue to be part of the conference (76% responded similarly in 2008). There was, however, a notable difference between academics and practitioners on this question, with practitioners being more supportive (81% providing responses of agree or strongly agree) than academics (64%). A similar difference was also observed in the 2008 Postconference Survey.

Changes Made in Response to Feedback

Given the overall positive ratings, we have primarily used feedback to fine-tune the conference. Fortunately, conference goers have provided us with a wealth of suggestions in the open-ended questions that we have included as part of the evaluation. Although it can be daunting to read the comments (in 2009 we received almost 250 pages), we enjoy reading your suggestions. Below we describe just a few of the modifications we have made to the conference in response to the feedback. We appreciate that these aren’t particularly earth-shattering, but they represent improvements nonetheless.

  • Sharing Materials. A number of respondents raised concerns about presenters being unwilling to share PowerPoint slides or conference papers with attendees.  In response to this concern, the Program Chair, Sara Weiner, modified the Call for Proposals and presenter guidelines so that the expectation of disseminating knowledge by sharing materials is made more salient and explicit.
  • No Shows. A number of respondents raised concerns about sessions in which presenters did not show up to present. These concerns were raised in person to the SIOP staff and SIOP Conference and Program chairs as well. Upon further investigation, it was revealed that these “no shows” occurred for two primary reasons: (a) elimination of funding for conference travel for presenters and (b) legal departments determining the material was proprietary and could not be presented.  These concerns generated a considerable amount of discussion as to the expectations of individuals who submit proposals to SIOP.  In response to this concern, Sara Weiner and her Call for Proposals Subcommittee (chaired by Eden King) made a modification to the proposal process. Specifically, proposers must agree to two stipulations before submitting proposals for review:  (a) that they have already verified that they have the legal right to present the material at SIOP and (b) that all presenters submitting proposals are committed to presenting at SIOP, regardless of changes in funding.
  • Quantity and Type of Food. Other comments about the 2008 and 2009 conferences related to the type and quantity of food offered at the coffee breaks. In particular, a number of respondents indicated they would appreciate healthier options. Accordingly, we have sought out healthier choices such as fruit and yogurt when available and economically feasible. Similarly, others raised concerns about the break food and drink being exhausted before breaks began. Hotel staff often set up the food and drinks 15–30 minutes prior to the scheduled break to ensure everything is in place, and as a result, conference goers who were not in sessions were reaching the food and drink first. To address this issue, we’ve been working with hotel staff to set up the food and drink just prior to the beginning of the break so that everyone will have access to the food and drinks.

Why Do We Do the Things We Do?

We appreciate the questions and comments raised on the conference survey, as well as in person, because they have provided us the opportunity to reexamine why we do things the way we do them! In this last section we would like to address some questions, raised by a number of conference goers, that ultimately did not result in a change per se but that merit an explanation.

Why don’t we have a lunch break? This question has been raised by several conference attendees. With 20 concurrent sessions, a 1-hour lunch break on each of the 3 days of the conference would result in 60 fewer hours of programming time. We thought we couldn’t afford to lose that much programming time, but perhaps we could manage a lunch break if we eliminated the two ½-hour breaks. Accordingly, we included a question on the 2008 Postconference Survey that asked conference goers if they would prefer to keep our two ½-hour breaks or have a 1-hour lunch break instead. Fifty-two percent of respondents indicated that they preferred the two ½-hour breaks. These data and several logistical factors led us to keep the two breaks instead of the lunch break. The primary logistical concern was the bottlenecks that would be created by our 3,500–4,000 conference attendees exiting the property, ordering lunch in the immediate area, and returning to the conference rooms at the same time. As a result, we decided it would be best to continue not providing an official break in the program for lunch, so that we can maximize our program time and prevent logistical problems.

Why do our sessions start and end at different times? Some conference goers have expressed frustration that our program does not have common start and end times for each session. Again, this scheduling approach is used due to logistical concerns. We have an unusually large number of conference attendees for any hotel, no matter how large it may be. If sessions ended at the same time, we would have bottlenecks in the flow of people walking from session to session (we even have bottlenecks with the staggered times!). In addition, staggered session times allow us more flexibility in working around first-author time conflicts when the program is set up.

Why don’t we provide information about all the receptions and parties held in the hotel?  The conference schedule provides information about all the receptions that are sponsored by SIOP.  These receptions are open to all conference registrants.  However, several companies and universities host parties in the hotel (or off site) that are not an official part of the conference, and accordingly, these hosts have control over how and to whom the details of their events are communicated.

As a community, we are quite satisfied with our conference. Yet, there will always be ways to improve the conference so that it meets your needs and results in a more productive and enjoyable event. We appreciate the informal and formal feedback you have provided us about the conference, and we hope you recognize that we are listening to you. Your thoughts and comments have proven to be very valuable in helping us make decisions about future SIOP conferences. You will be receiving a solicitation to participate in the 2010 Postconference Survey a few days after you return home from Atlanta. Please take the time to participate and to share your reactions and thoughts on the conference.