
SIOP Award Winners: Wiley Award for Excellence in Survey Research

Liberty J. Munson

As part of our ongoing series to provide visibility into what it takes to earn a SIOP award or grant, we highlight a diverse class of award winners in each edition of TIP. We hope that this insight encourages you to consider applying for a SIOP award or grant because you are probably doing something amazing that can and should be recognized by your peers in I-O psychology!
This quarter, we are highlighting the winners of the Wiley Award for Excellence in Survey Research: Christopher Patton and Justin Purl.

What award did you win? Why did you apply (if applicable)?

We won the Wiley Award for Excellence in Survey Research. We applied because we felt that, with Heartbeat analysis, we really had an analytical breakthrough—a new way to analyze survey data—and wanted to start the conversation with the larger I-O community on this new technique.

What is Heartbeat analysis?

Heartbeat analysis uses the variety of topics in the survey to get a sense for each person’s general sentiment (baseline) and then identifies the topics where the survey taker makes an unexpectedly high or low rating compared to their baseline. In other words, Heartbeat analysis is identifying when someone answers a survey question with greater passion than normal. 
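The idea can be sketched numerically. The following is a rough illustration only, not the authors' implementation (their R code appears later in this article); the item names, ratings, and the one-standard-deviation threshold are all assumptions made for the example.

```python
# Toy sketch of the Heartbeat idea: compare each rating against the
# respondent's own baseline (mean) and spread (within-person SD), and
# flag items rated unusually high or low relative to that baseline.
from statistics import mean, stdev

# One respondent's ratings on a 1-5 scale (illustrative items)
ratings = {"Pay": 4, "Manager": 4, "Growth": 5, "Workload": 4, "Tools": 2}

baseline = mean(ratings.values())   # this person's typical rating
spread = stdev(ratings.values())    # this person's within-person SD

# Within-person standardized score for each item
z = {item: (r - baseline) / spread for item, r in ratings.items()}

# Items at least 1 SD above/below the person's own baseline
up_votes = [item for item, score in z.items() if score >= 1]
down_votes = [item for item, score in z.items() if score <= -1]
```

Here the respondent's generally favorable baseline makes the low "Tools" rating stand out as a down-vote and the high "Growth" rating as an up-vote, even though a 4 on another item would be unremarkable for a more critical respondent.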

Share a little bit about who you are and what you do.

We are both People Analytics researchers at Google, doing research on both employee listening and selection.

Describe the research/work that you did that resulted in this award. What led to your idea?

It really started when Chris began thinking deeply about not just employee attitudes but particularly strong attitudes, sparked by his interest in the rare use of "strongly disagree" in employee listening surveys and the predictiveness of these rare signals. For Justin, Heartbeat is a natural next step from the concepts of applied multilevel modeling and profile similarity scoring. Once Justin and Chris started talking, the conversation shifted toward attitudes being relative to each person, and how it would be important to focus not on absolute values but on strong signals for each person, depending on their general disposition.

What do you think was key to you winning this award?

I think it was the uniqueness of the approach. We flipped the focus of analysis from item-level aggregation, which is a very common analytical approach in industry, to the individual level of analysis. By focusing on each individual, we were looking to account for everyone's uniqueness at scale to find interesting insights.

What did you learn that surprised you? Did you have an “aha” moment? What was it?

When we compared the top favorable results from our Heartbeat analysis, what we call "up-votes," with those from the more traditional percent-favorable calculations, we found that the number one up-vote item was only the 14th most favorable in the survey. It dawned on us that we were finding unique insights when looking at the individual level of analysis. Indeed, practitioners sometimes present only the top 10% of favorables, and our results suggest that important employee sentiment is not being heard in the traditional method.

What do you see as the lasting/unique contribution of this work to our discipline? How can it be used to drive changes in organizations, the employee experience, and so on?

I think a lasting contribution will be a new analytical approach the I-O community can use to unearth new and interesting insights, but also a potential shift toward the individual level of analysis: capturing people's uniqueness at scale and turning that into action, because greater differentiation between topics clarifies exactly where to act.

Given this was a new analytical technique, how did you create the R package or script needed to conduct the analysis?

That was all Justin. Justin is amazing at many things, and R is definitely one of them. He took our theoretical idea and turned it into an R script in 20 minutes. Without Justin, I'm not sure this would have been translated into R so easily. We shared our R code in our Wiley Award paper submission (see below), and Justin is actively working on turning the code into an R package that anyone can use.

--Heartbeat analysis R Code--

The code operates on data structured in wide format, where each respondent has one row and each item has its own column. The items to analyze are specified in the likert.items object; the example uses two items (not recommended in practice), called Item.A and Item.B. The code produces an object in which each specified item is re-scored to up-votes or down-votes using the Heartbeat formula: a threshold (in within-person standard deviation units) is applied to each person's standardized item scores (similar to z-scores). Positive threshold values produce up-vote scoring, and negative threshold values produce down-vote scoring.

library(dplyr)

# Items to include in the analysis (two shown for illustration;
# more are recommended in practice)
likert.items <- c("Item.A", "Item.B")

# Threshold in within-person standard deviation units:
# positive values produce up-vote scoring, negative values down-votes
threshold <- 1

mydata.heartbeat <- mydata %>%
  mutate(
    # Each respondent's baseline: mean rating across all items
    individual.mean = rowMeans(select(mydata, all_of(likert.items)),
                               na.rm = TRUE),
    # Each respondent's spread: within-person standard deviation
    individual.sd = sqrt(
      rowSums((select(mydata, all_of(likert.items)) -
                 rowMeans(select(mydata, all_of(likert.items)),
                          na.rm = TRUE))^2,
              na.rm = TRUE) /
        (length(likert.items) - 1)
    )
  ) %>%
  # Re-score each item to 1 (vote) or 0 (no vote) depending on whether
  # the within-person standardized score crosses the threshold
  mutate(across(all_of(likert.items),
                ~ case_when(
                  individual.sd == 0 ~ 0,
                  threshold > 0 ~
                    as.numeric((.x - individual.mean) /
                                 individual.sd >= threshold),
                  TRUE ~
                    as.numeric((.x - individual.mean) /
                                 individual.sd <= threshold)
                )))

How did others become aware of your award-winning work/research? 

Initially, we socialized the new analytical technique internally at Google and found a lot of excitement around the approach. We were strongly encouraged to apply for the Wiley Award from another Googler, Molly Delaney, after she learned of the technique.

Who would you say was the biggest advocate of your research/work that resulted in the award? How did that person become aware of your work?

Molly Delaney was such a big advocate early on and really championed the work. Molly is the People Analytics lead at X, formerly Google [X]. She became aware of the work almost by accident. Chris and Molly were having a catch-up meeting, and when the conversation turned to research, he mentioned the new idea to Molly—it was then that Molly immediately saw the value and application of Heartbeat analysis.

To what extent would you say this work/research was interdisciplinary? 

Heartbeat analysis draws on ideas, like ipsatizing and profile similarity scoring, from many subdisciplines of psychology (e.g., personality, social, cognitive). Outside of psychology, marketing and economics have similar concepts with respect to evaluating opinions against baselines, but we consider Heartbeat analysis a product of psychology.

What was the “turning point” moment where you started thinking about the problem/work through the other disciplines’ lenses? 

Chris was very interested in the idea of strong attitudes at the time of the initial idea, so work like Ajzen's theory of planned behavior and Cialdini's research on commitment (how people are motivated to behave and act consistently in ways that align with their prior commitments) shaped his thinking toward finding ways to identify these strong attitudes at scale, because prior theory suggests that strong attitudes influence future behavior.

What, if any, were the challenges you faced doing this work across disciplines (e.g., different jargon)? 

There were so many similarities between our Heartbeat analysis and other techniques (e.g., profile similarity scoring, mean centering, ipsatizing), so the real challenge was articulating how our analysis occupied a different area. For example, the ipsative method in personality psychology is very similar to what we are doing. By the technical definition of ipsatizing (i.e., a set of variables is called ipsative when the summed scores for each individual are the same; Ten Berge, 1999), Heartbeat analysis is different insofar as we are submitting only the "extreme within-person scores" to further analysis, which results in not everyone having the same summed score. In a sense, we are combining the ideas of ipsativity and the extreme-groups approach.
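That distinction can be shown numerically. The sketch below is illustrative only (the respondent labels, ratings, and one-SD cutoff are assumptions, not the authors' data): mean-centered (ipsatized) scores sum to the same constant for every respondent, whereas Heartbeat passes along only each person's extreme scores, so different respondents retain different numbers of scores.

```python
# Toy contrast between ipsatizing and Heartbeat-style scoring.
# Ipsatized (mean-centered) scores always sum to 0 for every
# respondent; Heartbeat retains only scores beyond +/-1 within-person
# SD, so what survives differs from person to person.
from statistics import mean, stdev

respondents = {
    "A": [5, 4, 4, 2],   # one notably low rating
    "B": [3, 3, 3, 3],   # no variance: nothing stands out
}

results = {}
for name, scores in respondents.items():
    centered = [s - mean(scores) for s in scores]
    sd = stdev(scores)
    # Keep only the extreme within-person standardized scores
    extreme = [c / sd for c in centered if sd > 0 and abs(c / sd) >= 1]
    results[name] = (round(sum(centered), 10), extreme)
```

Both respondents' centered scores sum to the same constant (zero), which is the defining property of ipsative scores, but respondent A retains one extreme score while respondent B retains none.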

How do you think the work benefitted by having multiple disciplines involved?

The fact that so many different disciplines were touching on this idea gave us confidence that we were on to something. Finding that another discipline has a similar concept shouldn't be the end of a theory or approach but fuel for the search for the underlying problem that both disciplines are trying to solve.

What recommendations would you give to others if they are doing interdisciplinary research? 

Have a conversation with others from different disciplines, and/or read a summary article or two to see if you can make unexpected discoveries.

Are you still doing work/research in the same area where you won the award? If so, what are you currently working on in this space? If not, what are you working on now, and how did you move into this different work/research area? 

Yes, there are many unanswered questions for Heartbeat analysis. For example, we are exploring how to determine an optimal within-person standard-deviation threshold for use in determining up- and down-votes. We decided on using one within-person standard deviation, but future investigation can assess whether a 0.5, 1.5, or 2.0 within-person standard deviation provides additional unique and valuable information, and under what circumstances each of the within-person standard-deviation cutoffs is the most valuable. Another area for us to investigate is how survey length impacts Heartbeat analysis (e.g., is the analysis more beneficial for longer surveys?).

What’s a fun fact about yourself (something that people may not know)?

Chris grew up with pet skunks and personally gave Morgan Freeman a tour of his childhood home. Justin uses Google Translate to understand his 2-year-old when he speaks Mandarin (“Zhège! Zhège!”).

What piece of advice would you give to someone new to I-O psychology? (If you knew then what you know now…)

Learn the core areas of I-O (e.g., job attitudes, selection, motivation), and learn them deeply—once you have, we find it helps to begin reading outside the field (e.g., in biology) to find inspiration for new ideas to test in I-O.


Ten Berge, J. M. F. (1999). A legitimate case of component analysis of ipsative measures, and partialling the mean as an alternative to ipsatization. Multivariate Behavioral Research, 34, 89–102.


Liberty Munson is currently the director of psychometrics for the Microsoft Technical Certification and Employability programs in the Worldwide Learning organization. She is responsible for ensuring the validity and reliability of Microsoft’s certification and professional programs. Her passion is for finding innovative solutions to business challenges that balance the science of assessment design and development with the realities of budget, time, and schedule constraints. Most recently, she has been presenting on the future of testing and how technology can change the way we assess skills.

Liberty loves to bake, hike, backpack, and camp with her husband, Scott, and miniature schnauzer, Apex. If she’s not at work, you’ll find her enjoying the great outdoors or in her kitchen tweaking some recipe just to see what happens.

Her advice to someone new to I-O psychology?

  • Statistics, statistics, statistics—knowing data analytic techniques will open A LOT of doors in this field and beyond!
