Survey best practices

A well-designed survey produces actionable insights. A poorly designed one produces data you can't use. The difference usually comes down to planning, question design, timing, and what you do with the results.


Start with the outcome, not the questions

Before writing a single question, define what decision you want to make based on the survey results. If you don't have a clear answer to "what will I do differently based on what I learn?", you're not ready to build the survey yet.

Work backwards from the outcome: what do you need to know to make that decision? What's the minimum set of questions that gets you there? A focused survey with eight relevant questions outperforms a comprehensive one with twenty questions where half are nice-to-know.


Design questions that produce useful answers

  • Ask one thing per question. Double-barrelled questions ("How satisfied are you with the price and quality?") produce answers you can't interpret cleanly.
  • Avoid leading questions. "How much did you enjoy our event?" assumes they enjoyed it.
  • Use scales consistently. If you use 1-5 for one question, don't switch to 1-10 for the next.
  • Make answer options exhaustive and mutually exclusive. Every respondent should be able to find their answer without it overlapping with another option.
  • Include an "Other" or "Not applicable" option where relevant. Forcing a choice from an incomplete list produces inaccurate data.

Use scenarios to reduce friction

If some questions only apply to certain respondents, use scenarios to skip irrelevant questions automatically. A respondent who gets a shorter, relevant survey is more likely to complete it and give more thoughtful answers than one who has to wade through questions that don't apply to them.
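The skip logic described above can be sketched as a simple routing table: each question maps specific answers to a follow-up question, and anything else falls through to a default. This is a minimal illustration only; the question IDs and data layout are made up and do not reflect any product's internal format.

```python
# Illustrative scenario routing: each answer can jump to a different
# question; unmatched answers fall through to the default next question.
SURVEY = {
    "q1": {"text": "Did you attend the event?",
           "routes": {"No": "q3"},   # skip the event-specific question
           "default": "q2"},
    "q2": {"text": "How would you rate the sessions?",
           "routes": {}, "default": "q3"},
    "q3": {"text": "Any other feedback?",
           "routes": {}, "default": None},  # None = end of survey
}

def next_question(current_id, answer):
    """Return the ID of the next question given the current answer."""
    q = SURVEY[current_id]
    return q["routes"].get(answer, q["default"])
```

A respondent who answers "No" to q1 jumps straight to q3, so they never see the event-rating question that doesn't apply to them.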


Send at the right moment

Surveys tied to a specific experience (an event, a purchase, a support interaction) should be sent as close to that experience as possible. Response rates and answer quality both drop significantly as time passes. Aim for within 24-48 hours.

For periodic feedback surveys (quarterly satisfaction, annual NPS), pick a consistent sending schedule and stick to it so you can track trends over time. Changing the timing between rounds makes comparisons unreliable.


Connect surveys to your workflows

A survey completion can trigger a workflow. When a contact finishes your post-event survey, a thank-you email goes out automatically. When a contact gives a low satisfaction score, an alert goes to your customer success team. When someone answers "Yes" to "Would you recommend us?", they're added to a referral interest list for a follow-up campaign.
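The trigger logic above amounts to a small event handler. This is a hypothetical sketch, not any product's API: the payload shape, the 1-5 threshold, and the `actions` helper methods are all assumptions made for illustration.

```python
LOW_SCORE_THRESHOLD = 3  # assumed 1-5 satisfaction scale

def handle_survey_completed(payload, actions):
    """Route a completed survey to follow-up actions.

    `payload` is an assumed dict of answers for one contact;
    `actions` is an assumed object exposing the follow-up steps.
    """
    # Every completion gets a thank-you email.
    actions.send_thank_you(payload["contact_id"])
    # Low satisfaction scores alert the customer success team.
    if payload.get("satisfaction", 5) <= LOW_SCORE_THRESHOLD:
        actions.alert_customer_success(payload["contact_id"])
    # Willing recommenders join a referral interest list.
    if payload.get("would_recommend") == "Yes":
        actions.add_to_list(payload["contact_id"], "referral-interest")
```

Keeping each rule as a separate, explicit check makes it easy to add or retire follow-up actions without touching the rest of the handler.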

This kind of automation makes surveys a living part of your marketing process rather than a periodic manual exercise.


Act on what you learn

A survey that generates data nobody acts on signals to contacts that their feedback doesn't matter. When you've analysed the results, communicate back to respondents: what you found, what you're doing about it, and when they can expect to see changes. Closing the feedback loop is what turns a one-off survey into an ongoing relationship.


Pro tips

  • Review your active surveys every six months. Question wording that made sense when you created the survey may no longer reflect how you talk about your products or services. Outdated phrasing can subtly skew results without you noticing.
  • Test the survey yourself before sending it. Check that every routing path works, that pre-filled fields populate correctly, and that the closing text appears after the final question.
  • For NPS or satisfaction surveys, track results over time by running the same survey across multiple campaigns and using the comparative report to spot trends.
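For reference, the standard NPS calculation behind such comparisons can be sketched in a few lines. The function name is illustrative; the formula itself (percentage of promoters minus percentage of detractors) is the standard one.

```python
def nps(scores):
    """Compute Net Promoter Score from 0-10 ratings.

    Promoters score 9-10, detractors 0-6, passives 7-8.
    NPS = % promoters - % detractors, ranging from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

Running the same calculation on each campaign's responses gives you comparable numbers round over round, which is exactly why the question wording and scale must stay identical between rounds.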
