Category Archives: Market Research

Survey Reporting Basics, An Example of What Not to Do: King County’s Employee Survey

I’m writing a series of posts to provide guidance to people who are learning the basics of market research. I’ll be using real-life examples of both great and bad research, analytics, and reporting.

In this first post about reporting survey data, we’ll discuss when it is appropriate to report a Top Box Score (i.e., the % of respondents who gave a “5”) versus a Top 2 Box Score (i.e., the % of respondents who gave a “4” or a “5”). I’ll use King County’s employee survey as an example of what not to do.
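For readers new to these terms, here is a minimal sketch of how the two scores are computed from raw 1–5 ratings. The responses below are invented for illustration; they are not King County’s data.

```python
# Hypothetical 1-5 ratings, for illustration only.
responses = [5, 4, 5, 3, 5, 4, 2, 5, 4, 5]

def top_box(ratings, boxes=1, scale_max=5):
    """Share of respondents whose rating falls in the top `boxes`
    points of the scale (boxes=1 -> Top Box, boxes=2 -> Top 2 Box)."""
    cutoff = scale_max - boxes + 1
    return sum(r >= cutoff for r in ratings) / len(ratings)

print(f"Top Box:   {top_box(responses):.0%}")            # % who answered 5
print(f"Top 2 Box: {top_box(responses, boxes=2):.0%}")   # % who answered 4 or 5
```

Note how much the choice of score matters: the same data can look mediocre as a Top Box score and glowing as a Top 2 Box score, which is exactly the issue in the King County example below.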

King County conducts an annual employee engagement survey for their approximately 14,000 employees. Below is how they asked the question in 2012.

Q. King County employees are treated with respect, regardless of their race, gender, sexual orientation, gender identity or expression, color, marital status, religion, ancestry, national origin, disability or age. (Using a scale where “1” means “I strongly disagree” and “5” means “I strongly agree”.)

Here are the results from their 2012 report (with my mark-ups):

[Chart: 2012 survey results, with mark-ups]

My interpretation of this chart, using a Top Box score, is that 56% of employees are treated with respect but 44% (more than 4 out of every 10 employees) aren’t treated with respect for some reason at work. That is nearly half of all employees! To present the data more positively, as King County did, you might use a Top 2 Box score and say: “…more than 90% [answered] positively”. NO. JUST NO. Why? Continue reading


Survey Basics: Instagram Example

I received a survey from Instagram asking about which features I use. Can you spot the small, meaningful omission in the question below?

[Image: Instagram survey question]

Continue reading

Why Context Matters in Survey Research

Survey questions with context ambiguities are usually a sign that the researcher doesn’t understand either the context the respondent might draw upon when answering the question or the context within which the company might use the data. One way to address both kinds of context ambiguity is cognitive interviewing: having test respondents think aloud while they answer survey questions. These interviews help ensure that respondents interpret questions the way researchers intended, and that the questions measure what researchers intended to measure. Here’s an example that shows why cognitive interviews are usually a good idea. It is from a survey asking about a large multinational bank.


Personally, my life’s dreams can’t be achieved through “prioritizing financial goals”. They, in fact, aren’t related to financial goals at all, and that is not the fault of this bank. Continue reading

Increasing Social Media Adoption or Aging Effect Among 50+ Year Olds?

The Pew Research Internet Project recently tweeted a link to an article on social media usage by age group. It included the chart below, which shows a generally upward trend (since 2006), possibly flattening, in social media usage across all age groups. Whenever I see a chart relating age to something technology-related, I always wonder whether the older age groups are actually adopting the technology or whether existing users are merely aging into a higher age group.


Continue reading
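The adoption-versus-aging distinction above can be made concrete with a toy example. The birth cohorts and usage rates below are entirely invented; the point is that nobody in this sketch adopts anything between surveys, yet the 50+ bracket’s rate still doubles, purely because a heavier-using cohort ages into the bracket.

```python
# Hypothetical (birth_year, usage_rate) cohorts, invented for illustration.
# No cohort's rate changes between surveys: zero individual adoption.
cohorts = [(1962, 0.60), (1955, 0.20), (1945, 0.10)]

def bracket_50plus_rate(cohorts, survey_year):
    """Average usage among cohorts who are aged 50+ in the survey year."""
    rates = [rate for birth, rate in cohorts if survey_year - birth >= 50]
    return sum(rates) / len(rates)

print(f"{bracket_50plus_rate(cohorts, 2006):.0%}")  # 1955 and 1945 cohorts: 15%
print(f"{bracket_50plus_rate(cohorts, 2014):.0%}")  # 1962 cohort ages in: 30%
```

This is why tracking the same birth cohort across survey waves, rather than the same age bracket, is the cleaner way to separate real adoption from composition effects.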

Question Writing 101 from Joe Dumas

I created and presented this summary as part of my usability testing class (HCDE 517) at the University of Washington. It covers basic question writing concepts that are applicable to both market researchers and usability researchers.

For me, the highlight of this article was Joe’s discussion of labeling the end points of scales. He reminds even experienced question writers that one’s choice of scale end points might activate two different cognitive structures instead of one: measuring something as “difficult” is not the same as measuring it as “not at all easy”.

How Market Research Experience Can Mislead You in Usability Studies

Diving into the emerging field of usability research after nearly a decade in traditional market research, I’ve learned that some concepts and processes are similar in both fields (e.g., surveys). There are, however, more differences than I expected. For example, when I first heard the term “usability testing” I translated it in my mind as “in-depth interview”. As I’ll discuss in a minute, there are significant differences even though the two concepts are related. Continue reading

Sauro’s 4 Steps to Translating a Questionnaire

The process for translating questionnaires is one thing that is exactly the same in market and usability research. Jeff Sauro, a highly regarded usability professional, posted “4 Steps to Translating a Questionnaire” on his blog last month. Continue reading