
Best Practice Brief

Survey Design and FAQs

Survey Design Best Practices


Does the survey flow like a funnel?
Envision the questionnaire as a funnel and imagine
an inverted pyramid: start out broad, and then go
narrower as needed. This is a logical way for the
respondent to think, making it easier for them to
answer the questions. Use transitional and segue
language to change topics. Usually, sections of the
survey are connected by topic and can be grouped.
Each block might be tied to a particular research objective; keep reminding yourself of those objectives to make sure you don't miss anything.

Introduction
Screening & classification
Awareness questions
Usage questions
Specific attitude questions
Any general attitude/beliefs
Sensitive questions
Other comments
Thanks & close

Is there an order bias?


Sounds like a no-brainer, but it's critical not to give away the answer to a question before asking it (e.g., sharing brand names before asking about unaided brand awareness).
Ensure earlier questions do not influence responses to later questions (e.g., don't reveal new and exciting information about a brand and then ask how much the respondent likes the brand or finds it innovative).
Order bias can also come into play if lists are always shown in the same order. Randomizing is a good default rule of thumb, except for lists that work better when always shown the same way (e.g., cities listed in geographical or alphabetical order).
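Most survey platforms handle answer randomization for you, but as a rough illustration of the principle, here is a minimal Python sketch (the function name and anchored options are illustrative assumptions, not a platform API): it shuffles the answer list for each respondent while pinning opt-out style options to the end.

import random

def randomize_options(options, anchored=("Other", "None of the above")):
    # Shuffle answer options to reduce order bias, keeping any anchored
    # opt-out options (e.g., "Other", "None of the above") at the end.
    movable = [o for o in options if o not in anchored]
    fixed = [o for o in options if o in anchored]
    random.shuffle(movable)  # a fresh order for each respondent
    return movable + fixed

# Example: each respondent sees the brand list in a different order.
print(randomize_options(["Brand A", "Brand B", "Brand C", "Other"]))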

Is the tone correct?


Don't forget to ensure the tone of the wording reflects the target audience; you may word questions differently when speaking to men aged 18-24 who are into video games than when speaking to professional lawyers. As a rule of thumb, a conversational tone is always best.

Are you leading the witness?


A loaded or leading question will bias people's responses. Be careful about asking a question that includes or implies the answer you're looking for, presumes something is already the case, or uses loaded words that create an immediate positive or negative reaction. For example:

"What do you think of the horrible effects of pollution?" This is a loaded question because it states that pollution is horrible.
"Have you stopped drinking alcohol?" This presumes the person already drinks alcohol.

Have you used visual questions?
Visual questions increase respondent engagement by presenting questions with interesting visual elements or engaging interactions that provide novel ways for respondents to choose answer options. Examples of visual questions include card sorts, magnetic boards, and highlighter question types. The following are some rules of thumb for using visual questions well:

Overall Page Layout. How is the overall page layout? Step back from your computer to look at the layout as a whole. If your first reaction is "blech," go back to the drawing board.
Legible Font Size. Is the font size legible? Remember that some people are at the bifocal stage of life and may have to squint to read those 9-point labels. Keep your fonts large enough to be readable, and ask a couple of your glasses-wearing colleagues to take a look before you deploy.
Succinct Labels and Instructions. Are your labels and instructions succinct? Your labels should be short enough to be readable at a glance, and your instructions should be concise enough to be scanned quickly; otherwise, you are undercutting the value of asking a visual question by making people do a lot of reading.
Appropriate Visual Questions. Are you using the right kind of visual question? Not all
question types are interchangeable. For example, visual grids are good for eliciting
comparative responses, while magnetic boards are good for monadic association
assessments (i.e., asking people to evaluate a single concept).
Varied Visual Questions. Are your visual questions varied? Grid after grid after grid will rob a respondent of enthusiasm for completing your survey faster than almost anything else, so be sure to vary the visual question types you use. Follow a visual grid with a slider, or a single choice with a card sort.
Repeating Grid Headers. Do your grid headers repeat? Nothing is more frustrating to people than losing track of which answer belongs in which column simply because the headers have scrolled off the screen.
Same Size Images. Are differently sized images going to lead to a biased response? Ensure any images used are roughly the same size to keep the page visually appealing and to make sure that one answer doesn't stand out too much from the others.
Edit Answer Layout. Do your answer options look appropriate on the page? Sometimes
answers may look best in a list and other times they may look better in multiple
columns/rows.
Magnet Spacing on Magnetic Board Questions. Ensure that you have spaced out your magnets in such a way that, if the magnets are randomized, they do not overlap each other.

Is there an introduction, segue, and thank you?
To polish your questionnaire, ensure that you include an introduction that explains a little about
what you are asking and why, without going into specifics. It is usually good practice to let
people know how long it will take them to complete the survey.
Similarly, at the end of the survey, be sure to thank respondents for their time. Here you might
want to provide more detail about the purpose of your research, or provide relevant links.
Throughout the questionnaire, especially in longer ones, it can be helpful to use segues to make the transition between topics smooth (e.g., "now moving on to something else") or to act as encouragement (e.g., "the final section of the survey is about you").

Does it feel too long?


Looks can be deceiving, so always time your questionnaire to ensure you are within the parameters you set out. If possible, have someone else review your questionnaire for length. Don't beat around the bush; just ask what you want to ask. The rule of thumb is that anything longer than 20 questions (or 5 minutes) for each member should probably be split into two projects. It's a good idea to tell members the approximate survey length in the invite.
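If you want a quick sanity check before timing a full run-through, a back-of-the-envelope estimate can help: multiply each question type by a rough per-question timing. The Python sketch below does this; the seconds-per-question values are illustrative assumptions, not benchmarks from this brief.

# Rough length check using illustrative per-question timings
# (the seconds-per-question values are assumptions, not benchmarks).
SECONDS_PER_QUESTION = {"closed": 12, "grid_row": 6, "open_end": 45}

def estimated_minutes(closed=0, grid_rows=0, open_ends=0):
    total_seconds = (closed * SECONDS_PER_QUESTION["closed"]
                     + grid_rows * SECONDS_PER_QUESTION["grid_row"]
                     + open_ends * SECONDS_PER_QUESTION["open_end"])
    return total_seconds / 60

# 15 closed questions, a 10-row grid, and 2 open ends come to about
# 5.5 minutes, already past the 5-minute rule of thumb.
print(round(estimated_minutes(closed=15, grid_rows=10, open_ends=2), 1))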

Will the person understand the question?


If your terminology or the language used isn't clear to you, it won't be clear to your audience. And if there's any ambiguity, it will likely end up creeping into your results. Use clear and intuitive language and ask questions that are sufficiently specific. For example, don't ask, "How much do you earn?" Instead, be specific and ask, "What was your annual household income before taxes in 2012?"
Also, be careful about response options that create a double negative. For example, "Do you agree or disagree with the following statement? Teachers should not be required to supervise their students during recess."
If the person disagrees, they are saying they do not think teachers should not supervise students. In other words, they believe that teachers should supervise students. If you do use a negative word like "not," consider highlighting the word by underlining or bolding it to catch the respondent's attention.

Will the person be able to answer the question?


When creating a list of possible responses, always put yourself in the shoes of the person completing the survey and check that there is at least one answer every person can truthfully provide. If that's not the case, include more answers in your list, add an "Other" option, or provide an opt-out (e.g., "Not applicable" / "None of the above").

There are many reasons why someone might not be able to answer a question:

Person knows they don't know. For example, "What is the speed of the internet connection to your home?" is a question that many people do not know the answer to. "Don't know" should be a response option.

Question is inconvenient to answer. For example, "How many tins of food do you have in your home?" would normally require that people be at home while they do the survey and be willing and able to go to the relevant room(s) to find out how many tins they have.

Person does not know their own motivations. For example, "How important is taste versus convenience versus price to you when buying tinned fruit?" This sort of question usually requires an indirect method of questioning, such as conjoint analysis or regression, to estimate someone's motivational structure (see the sketch after this list).

Question is very personal. If you're asking sensitive or personal questions, explain why, and consider including a "Prefer not to say" response option if you think people will leave the survey because of the question. You can also help people save face by giving response options that allow them to feel positive about themselves after responding.

Time frame is challenging to answer. For example, "How many times did you purchase coffee at Starbucks in the past 6 months?" is not an easy question to answer. Make sure the time frame is something a person can calculate or easily reference in their mind.

Person's answer is not in the list. Consider the different answer options and make sure you give respondents the right number of choices. For example, in some cases Yes/No may not be enough, and it may be necessary to include "Not sure." This also applies to scales, where we have to make sure we don't leave gaps. For "Other" options, try to include a "Specify" text box where appropriate. This gives respondents a chance to be specific and gives you more robust data.

Answer options are double-barreled. For example, "Was the flight attendant service fast and friendly?" The service could have been friendly but not fast, or vice versa.
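As a rough illustration of the indirect approach mentioned in the motivations item above, here is a minimal Python sketch of derived importance via regression: instead of asking people how important taste, convenience, and price are, you regress an overall rating on attribute ratings. The data and attribute names below are made up purely for illustration; a real study would use conjoint analysis or a proper key-driver model.

import numpy as np

# Made-up illustrative data: columns are ratings of taste, convenience,
# and price; overall_rating is each respondent's overall product rating.
attribute_ratings = np.array([
    [9, 6, 4],
    [7, 8, 5],
    [5, 7, 9],
    [8, 5, 6],
    [6, 9, 7],
])
overall_rating = np.array([8, 7, 6, 7, 7])

# Least-squares fit with an intercept; larger coefficients suggest
# stronger drivers of the overall rating ("derived importance").
X = np.column_stack([attribute_ratings, np.ones(len(overall_rating))])
coefs, *_ = np.linalg.lstsq(X, overall_rating, rcond=None)
for name, weight in zip(["taste", "convenience", "price"], coefs):
    print(f"{name}: {weight:.2f}")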

Will the person be willing to answer this question?


Not everybody is willing to say how much they earn; even fewer are willing to say whether they have recently cheated on their partner. Making it clear that the data is anonymous and indicating why the question is being asked will normally increase the number of people who are willing to answer. Adding a "Prefer not to answer" option can help prevent respondents from leaving the survey when they reach a question they are unwilling to answer.

Will we be able to interpret the meaning behind the answer?


For example, the question "Was the train clean and on time?" is fairly easy for someone to answer. If the answer is "Yes," the researcher knows what is being conveyed.
However, if the person answers "No," their train might have been late, or dirty, or both.

Have you asked the question before?
If you've already asked the question before and someone's answer wouldn't have changed since the last time you asked, you don't need to ask them the question again if you're running an insight community.

Did members get asked for additional feedback?


Give respondents the opportunity to provide additional feedback at the end of the survey. Having people pre-code their answers for you will make analysis and sorting easier.
For example, ask four open ends instead of one:

Any feedback about the topics we covered today?


Any feedback about the survey design?
Any feedback about our customer service overall? (For a specific issue please contact 1.800
etc.)
Anything else you want to tell us?

These questions can be grouped together on one page to reduce click-throughs. Give respondents the option to skip these questions if they have nothing to add by making them not required in the question's properties section.

Does the survey make sense when read out loud?


Reading your survey back to yourself, ideally out loud, is a sure way to catch anything that reads strangely or makes you stumble or think twice. It will also help you tell whether the order feels natural.

Did you preview the survey online and on mobile?


Surveys often look different once they're programmed. Make sure to preview the survey online and on mobile before you launch.

Did you pre-test?


You could also consider doing a pre-test/soft launch prior to launching to everyone. This isn't always necessary, but it can catch programming and other design errors before the survey goes to a broader audience.


Survey FAQs
What do I need to know about writing a survey specifically for mobile?

Profile for smartphone ownership/usage on your community (can be done outside the
profiling questionnaire).
Communicate flexibility and choice in invitations to projects.
Maximize screen space by decreasing question and answer wording.
Design surveys from the ground up with mobile in mind. This means being careful about
scrolling, long lists, multiple choice questions, number of grid options, and use of images.
Be mindful of survey length. Mobile can take up to 50% longer to complete, so keep activities short.
Know which question types are supported on mobile and desktop. For example, click maps and videos are not supported on mobile, so make sure to carefully review the experience for members before deploying.

What is the optimal survey length?


Ideally, surveys shouldn't be longer than about 5 minutes or roughly 20 questions for each member. If they are longer than that, consider splitting the survey into two. One of the great benefits of having an insight community is the iterative process and ongoing relationship you build with community members, so surveys don't need to be exhaustive.
That said, expectations should be set with members when they join the community, and potentially for individual projects if you think activities will regularly be time-intensive. Then each person can decide whether it is something they want to participate in and weigh their time against the benefits of being in the community.

How long should video or audio clips be?


The length of video or audio, as well as the amount of time it takes to complete the full activity, can vary depending on what expectations you set with community members. If you tell them they're going to get to watch full-length TV pilots, then they'll know it will take more time.
You should also consider the number of clips played in an activity. Just as with concept testing, you don't want to overwhelm people with stimuli. The rule of thumb is no more than three short videos in an activity.
Consider what is a reasonable request based on what you're giving members in exchange, both intrinsically and extrinsically.

I'm including a video clip in my survey. Do I need to provide instructions?
If you want to play a video in a survey, consider giving people a heads-up on the page before the video. Ask them to check their sound and let them know they need to click play on the next page (or that it will play automatically).

Can I start my survey with an open-ended question?


It is typically not a good idea to start a survey with an open-ended question, as it can intimidate people and they may decide to exit the survey. It is fine to ask one as the second question, so think about an easy-to-answer closed question you can start with. One exception is when you are trying to get a big-picture answer from members before funneling their thinking in one direction or another.

Should I send reminders?


Sending reminders should bump up your response rates by 10-25%, but it really depends on how much communication is going out the door to each member. For high-volume communities (e.g., an activity every week), sending reminders is usually not a good idea if everyone is getting the activity invitation; communicating once a week is already a lot.
For lighter communities, B2B communities, or very important projects, sending reminders is appropriate.

How long should surveys be in-field?


We recommend keeping surveys open for at least a week. If you need results in a hurry, you can always pull in-field data; just make sure you save the full data file as well as your tabs so you have them stamped for the same time period. That said, some clients do run omnibus or other quick-turnaround activities that they tell people will only be open for a day or two.
Regardless, the majority of people will likely complete your survey within the first 24 hours. After that, the email invitation is usually far below the fold in their inbox.

Do I need to ask region and gender in every survey if I need it for analysis?
One of the reasons to have an insight community in the first place is so that you don't have to ask these types of questions again. Put the key questions you want to use for sampling and analysis in the profiling questionnaire. It's typically only dynamic information that you would need to ask again for analysis (e.g., number of children in the household, last time they used a particular product, etc.).

What are important things to include in my invite?
The activity invitation should include:

Interesting subject line


Personalized greeting
One sentence describing what the activity is about
Reason the activity is relevant to the person
Estimated length of time it should take to complete the survey
Close date
Link to activity
Incentive information
Contest rules

You could also consider including a couple of lines in the email P.S. about what you learned
from other recent activities, member quotes, interesting findings, sneak peeks, etc.

If I provide an "Other" option, do I need to include a "Please specify" box? How much extra work is that for me to manage?

Whether to add an "Other" option or an "Other, please specify" really depends on how sure you are that your list is complete. If you are not sure, then include an "Other," or in some situations consider a "None of these." Most clients ask for "Other, please specify" as opposed to just "Other" so they can see how people are responding. Sometimes you get good nuggets, and sometimes you don't. Ideally, if you go into a meeting with a high "Other" category, you should be able to give a sense of what people said.

How many questions are good to have on one page?


Usually the standard is to have just one question per page. But sometimes, if there are questions that fit nicely together (e.g., same topic, profiling information, etc.) or you're asking a follow-up open-ended question, it makes more sense to put the questions together.

What is the purpose of doing a soft launch?


A soft launch or pre-test can catch programming and other design errors before the survey goes to a broader audience. It's particularly important if you have a longer survey with lots of skips or complex scripting, or if you are unsure how people are going to respond to the survey and you want to test-drive the questions some more.

Vision Critical 2013 www.visioncritical.com/university
