Customer Surveys – The Definitive Guide

I have what might, at first, seem like a rather silly question for you: why do you want to use customer surveys?

The knee-jerk answer is usually something along the lines of “so I can gather more information about my customers” or “so I can better understand my customers.”

While these answers aren’t wrong, they are a bit shortsighted…

Yes, surveys are used as a tool to collect details about your customers. But this information is meaningless (as is the act of conducting the survey in the first place) if it’s not used to improve the way your company operates and provides services to your customers.

In other words, the survey is a means to an end – not an end in and of itself.

To help you out, we’ve created the ultimate guide for creating customer surveys. It was built to produce actionable insights, with the hopes that you’ll be able to use the customer feedback you receive to improve your products and services and grow your business.

In this guide we’ll dig into the major aspects of planning, developing, distributing and collecting responses for a customer survey. If you’ve never surveyed your customers before, you’re in the right place. We’ll show you everything you need to know.

Sound good? Let’s jump in…

(Note: You can apply everything you’ll learn in this guide by creating a beautiful, customized survey with Fieldboom. It’s free to get started and you’ll gather the insights you need to improve your product, increase your sales and keep your customers happy.)


The Planning Stage

Before you even begin designing customer surveys and developing questions for them, you’ll have a few things to consider:

  • What are your goals for conducting the survey? Which questions do you want answered?
  • What tools will you use for creating and delivering your survey?
  • How do you plan on distributing your survey?

Let’s take a look at each of these questions in greater detail.

Defining Goals For Your Customer Survey

Just as you do with all other campaigns and initiatives you undertake within your business, you need to have a clear-cut goal whenever you decide to conduct a survey.

But defining (and refining) your goal for a survey isn’t as simple as saying you “want to learn more about your customers” or “improve your customer service.”

Holding onto such vague goals will, consequently, lead to vague conclusions once you’ve collected information from your customers. In turn, it will be nearly impossible to use this information to make any sort of meaningful changes to your company’s operations.

So, begin by creating a goal that will act as the foundation for the development of your entire survey. The more specific you are when defining the goal of your customer surveys, the more likely you are to gather information that will actually help your organization.

A few examples:

  • Determine which aspects of our service our customers find most valuable, so we can focus on improving these areas quickly
  • Get feedback from customers who have purchased our products online, so we can determine if the checkout process is streamlined or not
  • Learn about the needs of our customers that we aren’t meeting, so we can improve the overall service we provide

Now, you might already have such a specific goal in mind, but you might not. If you don’t, there are a few questions you can ask yourself to help narrow your focus:

  • What do I want to learn?
  • Who do I want to ask?
  • What am I going to do with the information I learn?

Let’s look back at one of the examples from above:

Get feedback from customers who have purchased our products online, so we can determine if the checkout process is streamlined or not.

This goal clearly answers each of the questions above:

  • They want to learn about the quality of their online checkout process
  • They want to ask only customers who have shopped at their online store
  • They want to streamline the checkout process for their online shoppers

Setting such a definitive goal makes it much easier to create a survey with a clear focus, which, in turn, increases the chances that the information gathered from the survey will be incredibly useful to your organization.

It also makes it easier to develop your survey, and increases the likelihood of generating responses from your customers. More on that later.



Customer Survey Software & Tools

You probably aren’t going to be creating your survey by hand, right?

And you almost certainly aren’t going to be manually collecting, sorting, and analyzing your data, either.

To make the survey process go as smoothly as possible – both for yourself and for your survey respondents – you’ll want to use tools that:

  • Are easy to use
  • Offer the necessary features and capabilities
  • Are cost-efficient

Let’s briefly talk about each of these factors in terms of both your organization and your customers.

(Note: Fieldboom is a great choice for creating your customer surveys. Sign up free.)

Ease Of Use

This is probably a no-brainer, but the entire reason technology and tools exist in the first place is to make processes easier.

So, when looking for a customer survey-creation tool, you want to be sure it’s actually going to enable you to do what you need to do without adding any frustration to the mix.

More importantly, though, you absolutely want to be sure that your survey is easy for your customers to complete. If they face any kind of glitches while doing so, they’re much more likely to abandon the survey altogether than they are to stick around and figure out what the problem is.

Make sure the survey software you decide to use makes it easy for you to create your survey and even easier for your customers to complete it.

Features (For You)

On the back-end of things, the survey tool you use should allow for a number of customization options, such as the use of various question types and the implementation of survey logic (we’ll get more into these things later).

Such customization will allow you to tweak your survey to ensure that you can actually ask the question you want to ask. On the other hand, a tool that only allows you to create surveys with generic question types and the like will make it more difficult to actually reach the goal you’ve already set for your survey.

Also, the tool you use should automate the process of collecting and organizing data in a variety of forms. This will allow you to easily view the important data you’ve been seeking in a way that is most intuitive and understandable to your organization.

One last thing to consider is whether your tool of choice offers the ability to automate receipt confirmation and follow-up messages to your respondents. Keep this in mind for later on.

Before you pull the trigger and purchase a survey-design tool, make sure it does exactly what you need it to do. But be prudent, as well. An all-encompassing survey tool might actually offer too much for your current needs, and may end up costing you more than it’s worth.

Features (For Your Audience)

On the front end, you want to be able to design your survey so that it’s aesthetically pleasing to your audience.

Make sure the tool you choose allows you to customize the look and feel of your survey, so that your customers aren’t forced to stare at an ugly, poorly-designed form for your benefit.

In terms of what to look for, make sure your chosen survey tool allows you to align your survey with your brand’s color theme and overall style. It may seem trivial, but offering your customers a survey that looks and feels like it’s truly a part of the overall customer experience can make respondents more apt to engage with and fully complete your survey.

Other Resources

Before you dive into the process of creating and delivering your customer survey, you want to have a pretty good idea of what it will cost you in terms of resources such as time, money, and manpower.

To do so, you’ll need to think realistically about everything that conducting a customer survey involves:

  • Survey Creation & Delivery: From question creation to design, actually creating a survey can (and should) be a time- and resource-consuming process. Doing so involves input from a number of specialists, some of whom you may be hiring on a freelance basis. Make sure you plan ahead to ensure your team has ample time to focus on developing your survey, or that your company can afford to hire out if needed.
  • Responding to Surveys: As we’ll talk about a bit later on, responding to customers once they submit their responses is incredibly important – especially if they report having had a negative experience. Make sure you have resources set aside to communicate with respondents immediately, should the need arise.
  • Analyzing Results & Making Improvements: After you’ve conducted a survey, your work has only just begun. You’ll then need to dig into the data your customers have provided to determine how you can improve your company’s operations to better serve them. Of course, making these improvements will inherently require heavy resource allocation. Make sure you’re prepared to take this step, so that the time, money, and energy you’ve invested up to this point isn’t wasted.



Planning Your Customer Surveys For Distribution

The way in which you distribute your customer surveys depends largely on the answer to one question:

Where are your customers most active?

In other words, think about where and how your customers are most likely to engage with and complete your survey.

The most common means of delivering customer surveys are:

  • Online (via company website)
  • Email
  • Phone
  • In Person

But just because you deliver your survey through a given channel doesn’t mean you can’t use other channels to point your audience to it. For example, if you make your survey available on your website, you can, of course, include a link to the page within email and social media communications (and other channels).

This guide will focus mainly on strategies and tips for creating web-based customer surveys.

(Note: You can apply everything you’ve learned in this guide by creating a beautiful, customized survey with Fieldboom. It’s free to get started and you’ll gather the insights you need to improve your product, increase your sales and keep your customers happy.)



Effective Survey Design

In the last section, we talked a bit about survey design in terms of aesthetic appearance and overall functionality.

In this section, we’ll talk more about how to structure your customer survey in a way that makes it easy for your audience to engage with it.

Let’s start at the very beginning.

Customer Survey Introduction

As with any piece of deliverable content, your survey absolutely needs an introduction. A strong survey introduction includes the following elements:

  • Company Info: A short description of your company and its services. This serves to refresh your customer’s memory, and builds a frame for the rest of the survey.
  • Survey Goals: Explain why you’re conducting the survey. While you can describe these goals from your company’s perspective, you should also explain how respondents will ultimately benefit from your conducting the survey.
  • Data-Tracking Information: Tell respondents how the information they provide will be tracked. Explain whether they’re expected to provide identifying information, or if the survey will be conducted anonymously.
  • Survey Instructions: Provide a short explanation for how to fill out the survey, as well as how to submit the form. Be clear and concise with your instructions – but also provide the option for respondents to skip the instructions if they aren’t needed.
  • Survey Length: Explain how long it typically takes respondents to complete the survey. This will give your customers a good idea of how much time they should set aside – and will cut down on instances of abandonment.
  • Privacy Statement & Consent Form: Regardless of whether the survey is conducted anonymously or not, make sure to include information regarding how the submitted information will be used (as well as what information will be used). Be sure to include a consent form, as well. This form could be as simple as a checkbox next to a statement of consent.

Though it might seem like a rather inconsequential part of the overall survey, a well-designed introduction serves many purposes:

  • It gets your customers really thinking about their experiences with your brand
  • It empowers them by explaining that their opinion is truly valued, and will help improve the services your company provides
  • It explains what the survey will be asking of them in terms of information provided, as well as time and energy being spent

Now, let’s look at how to structure your survey’s questions.

A Note On Survey Length

In a perfect world, we’d be able to provide you with an “ideal” length for your customer survey.

However, because so many variables exist from one survey to another, it’s not possible to provide a definitive answer to this question.

That being said, the typical customer probably isn’t going to spend more than a few minutes filling out your survey. And they definitely won’t want to fill out a survey that asks a bunch of superfluous or otherwise unimportant questions.

So, the best advice to take when determining the length of your survey is:

Make it as long as it needs to be – and no more. Get right to the heart of the matter by ensuring each question (and the answers respondents provide) relate directly to your goal for conducting the survey.

One thing to note, though, is that the longer your survey is, the less time respondents will spend on individual questions. The less time they spend thinking about their answers, the less valid these answers become.

The trick, then, is to find a balance: your survey should be long enough that each question gets the thought it needs to be answered sufficiently, but short enough that respondents don’t end up rushing through the last half of it.

Survey Question Order

In the next section, we’ll dive into the actual creation and writing of survey questions.

But first, let’s talk about how the order in which these questions appear can influence the way in which your customers answer them – and what you can do to mitigate such bias.

Broad To Specific Or Vice-Versa?

Your survey will undoubtedly include some questions that deal more with your customer’s overall experience, and some that deal with specific aspects of such.

As you may have guessed, the order in which you ask these questions definitely has an effect on how your customers will respond.

There’s been ongoing debate about whether to begin customer surveys by asking broad questions first, like so:

  • How would you describe your overall experience with XYZ company?
  • How welcome did XYZ’s staff make you feel?
  • How would you describe XYZ’s customer service?
  • How would you describe XYZ’s checkout process?

Or to begin with the specific questions, and end the customer survey with an overarching question, like so:

  • How welcome did XYZ’s staff make you feel?
  • How would you describe XYZ’s customer service?
  • How would you describe XYZ’s checkout process?
  • How would you describe your overall experience with XYZ company?

There are two schools of thought that validate either of these methods.

On the one hand, asking more general questions first leads respondents to think about their overall experience in more holistic terms. In other words, they’ll first think about how the entire experience made them feel – without breaking down the experience into specifics.

On the other hand, asking specific questions first will lead respondents to think about their experience in more procedural terms, all of which eventually combine to form the entire experience.

However, certain issues can arise no matter which way you decide to go.

Customers responding to a broad-to-specific survey may frame their answers to specific questions in terms of how they answered the broad ones – or they might not be entirely honest with their more specific answers.

For example, if they report that their overall experience was “very pleasant,” they might subconsciously convince themselves that all of the more specific aspects of the service were “excellent,” even if some of these areas were lacking.

Customers responding to a specific-to-broad survey can also be influenced by their previous responses, as well, in that their answers to the specific questions might cause them to go “against their gut” when answering the general question.

For instance, let’s say a customer initially felt they had a negative overall experience. If they responded rather positively to the specific questions that were asked, this might cause them to second guess their initial response. Of course, there may be other factors that the survey didn’t address that led to this customer’s negative experience – but, on paper, it will appear as though their experience was overall pretty positive.

As it was with survey length, there’s no hard-and-fast solution to this conundrum. Rather, you just need to know that these biases exist – and keep them in mind when digging into your survey’s results later on. In turn, you’ll be better able to make more sense out of responses that, at first glance, appear to contradict each other, or that are otherwise rather anomalous.

Answer Order

The order in which you list your answers can also influence your customers’ responses.

This comes into play most often when asking multiple-answer questions (which we’ll talk about in a bit). For example, say you ask the question, “What do you enjoy most about our service?” and provide the following choices:

  • Customer Service
  • Atmosphere
  • Pricing
  • Products
  • Other

While not an absolute certainty, respondents may simply choose the first answer they see that jumps out at them – even if another choice would be more accurate. The larger the number of respondents who fall into this bias, the less reliable your overall results will be.

To mitigate this issue, you can randomize the order in which the answers are listed for this type of question. While this doesn’t eliminate the bias in individual respondents, per se, it will at least make it so the bias doesn’t result in a focus on one single answer across the board.

Two things to note:

  • Randomization only works with the above type of question. Don’t implement it within questions dealing with scaled responses
  • Don’t include answers such as “other” or “not applicable” in the randomization. These answers should always be at the end of the list
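
Both rules can be handled in a few lines of code. Here’s a minimal sketch (in Python, with hypothetical names – this isn’t taken from any particular survey tool) of how answer order might be randomized while catch-all options stay pinned to the end:

```python
import random

# Catch-all options that should never be shuffled into the main list.
PINNED = {"Other", "Not applicable"}

def randomize_choices(choices):
    """Shuffle substantive answers; keep pinned options at the end."""
    main = [c for c in choices if c not in PINNED]
    pinned = [c for c in choices if c in PINNED]
    random.shuffle(main)   # randomize only the substantive answers
    return main + pinned   # "Other" / "Not applicable" always come last

order = randomize_choices(
    ["Customer Service", "Atmosphere", "Pricing", "Products", "Other"])
```

Each respondent would get a freshly shuffled list, so no single answer benefits from always appearing first.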

Survey Logic

Earlier on, we mentioned the term “survey logic,” and that, in order to ensure that your customers’ responses are as accurate as possible, you need to make sure the tool you use to conduct your surveys allows you to implement this strategy.

Survey logic is implemented in one of two ways:

  • Branches
  • Filtering

Branching (also known as “conditional branching” or “branch logic”) allows survey creators to send respondents to specific survey questions or sections based on their answers to an earlier question.

Say, for example, a respondent reports that a company’s customer service was most influential in determining the quality of their overall experience. The survey creator would be able to create more specific questions regarding aspects of their customer service, and then use branching logic to bring this respondent to these more specific questions. Respondents who answered differently would, of course, be brought to a different set of questions.

Filtering is similar to branching in that it allows subsequent questions to be asked based on a respondent’s answers. However, filtering also dynamically skips questions that don’t apply to certain respondents based upon their previous answers.

For example, a customer survey might ask, “Did you ask for assistance from customer service during your shopping experience?” If the respondent answers “yes,” they’ll be brought to a set of questions about the customer service department’s ability to help them. If they answer “no,” they’ll simply be brought to the next survey question.
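
Under the hood, branching and filtering both boil down to mapping each answer to the next question to show. A minimal sketch, assuming a simple dictionary-based representation (the question IDs and wording are illustrative, not from any particular survey tool):

```python
# Each question maps certain answers to a follow-up question id;
# answers not listed fall through to a default next question.
questions = {
    "q1": {"text": "Did you ask for assistance from customer service?",
           "branches": {"yes": "q2"},   # "yes" branches into follow-ups
           "default": "q3"},            # "no" skips them (filtering)
    "q2": {"text": "How well was customer service able to help you?",
           "branches": {}, "default": "q3"},
    "q3": {"text": "How would you rate the checkout process?",
           "branches": {}, "default": None},  # None = end of survey
}

def next_question(current_id, answer):
    """Return the id of the next question, given the current answer."""
    q = questions[current_id]
    return q["branches"].get(answer, q["default"])
```

A respondent answering “yes” to q1 is routed to q2; one answering “no” skips straight to q3, exactly as in the example above.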

Survey logic is beneficial for a number of reasons:

  • It allows you to probe deeper into a specific issue “on the fly”
  • It shortens surveys for respondents by eliminating unnecessary questions
  • It creates a more personalized experience for each of your respondents

All of these factors increase the chances that your customers will complete and return your survey – and increase the validity of their responses, as well.

Wrapping Up Your Customer Survey

Once your respondents have completed the “meat” of your survey, you want to metaphorically put a bow on the experience for them.

First things first, use the final page of the survey to thank them for taking the time to help your company improve its services. This small show of gratitude humanizes the experience, and reminds your audience that your company truly does value their input.

On this same page, provide respondents with added value or a surprise offer. You might choose to give them a coupon for free or discounted service, or you might provide them with complimentary downloadable content, such as an ebook or ultimate guide.

Lastly, if applicable, ask for their contact and demographic information. The reason to do this last is that their survey responses will have already been submitted whether or not they choose to provide this information. If you were to ask for it first, anyone who didn’t want to provide it might not fill out the survey at all.

For those that do submit their email address or other contact information, make sure you send them an additional “thank you” message as a follow-up.

When it comes to figuring out what your customers want to get out of your products or services, one of the most efficient ways of doing so is to simply ask them.

Okay, okay. So it’s not quite that easy.

While you might at first assume that most customers would jump at the chance to have their voices heard, in truth the vast majority of consumers don’t respond to customer surveys of any kind. Surprisingly, a response rate of 10-15% is actually pretty decent; a response rate of just 2% is reportedly the norm across most industries.

There are a number of reasons the response rate of customer surveys is so low – many of which, such as excessive survey length and confusing questions, this guide addresses directly.

(Note: You can apply everything you’ve learned in this guide by creating a beautiful, customized survey with Fieldboom. It’s free to get started and you’ll gather the insights you need to improve your product, increase your sales and keep your customers happy.)


Writing Customer Survey Questions


Now, we’re going to dig into perhaps the most critical part of survey creation:

Writing the survey’s questions.

First, we’ll talk about the types of questions you might choose to include in your customer survey. Then we’ll discuss why the way in which you word these questions is so important and go over some of the most common mistakes made when creating survey questions.

Let’s start with survey question types.

Types Of Survey Questions

There are many different ways you can ask questions regarding your customer’s experience with your brand.

But, because you’ll want to set certain parameters for your respondents’ answers, you’ll want to ask certain questions in certain ways.

Here, we’ll discuss the types of questions you might choose to ask, as well as the possible responses you might include to accompany these questions.

Open & Close-Ended Questions

No matter what, every question you ask within your survey will either be open- or close-ended.

Close-ended questions provide specific answer choices for respondents to choose from.

Examples include:

  • “On a scale of 1-7 (7 being highest), how would you rate our customer service?”
  • “Did you make your purchase in person or online?”
  • “Of the following choices, which most influenced your decision to make a purchase?” 

Close-ended questions are inherently quantifiable, in that responses to a single question can be tallied across a customer base to assess where the majority of a company’s customers stand on a certain issue.

Of course, close-ended questions don’t offer respondents the opportunity to clarify their answers. Which is where open-ended questions come in.

Open-ended questions allow respondents to use their own words within their answer. Examples of open-ended questions include:

  • “Explain what you liked best about our service.”
  • “Why did you choose X as the most important factor in your buying decision?”
  • “Was there anything the survey didn’t mention that you want to discuss? If so, explain.” 

While open-ended questions can provide a more complete look into a customer’s experience, it’s best to use them sparingly.

On the respondent’s end, open-ended questions take much longer to answer – and they require more effort to complete, as well. Because of this, respondents often choose not to answer open-ended questions.

On the surveyor’s end, responses to open-ended questions aren’t quantifiable – which means they take much more time to analyze and understand. Whereas responses to close-ended questions can be quickly categorized (and can even be done automatically via survey software), open-ended responses need to actually be read by the surveyor in order to be understood.

That being said, open-ended questions are best used to accompany close-ended questions, rather than to stand by themselves. For example, you might choose to provide respondents the option to explain their answer to a given question. Again: make it clear that responding to these open-ended questions is optional; otherwise, you run the risk of overwhelming your respondents, leading them to abandon the survey altogether.

Single Answer Versus Multiple Answer Questions 

With single-answer questions, respondents will only be able to choose one answer. The answers to these questions are typically either opposites of each other or scaled responses. For example, for the question “Was your most recent purchase made online or in person?”, only one of the answers can be true. Or, if a question asks respondents to rate the company’s checkout process on a scale of 1-7, they would need to choose a single rating.

Another possibility would be a question in which respondents are asked to choose what they consider to be the most important aspect of the company’s service.

In contrast, multiple-answer questions (often called “checkbox questions”) are those which, naturally, allow respondents to choose more than one answer.

For example, you might ask your customers how they’ve interacted with your company in the past, providing answers such as “Social Media,” “Website,” “Storefront,” etc. Respondents who have engaged with your brand in more than one way would then be able to choose every answer that applies to them.

To further illustrate the difference between the two, the above question in single-answer form would be “Which channel do you use most when engaging with our company?” In this case, only a single answer would provide the information the surveyor is looking for.

Forced Versus Neutral Questions 

The way in which answer options are provided for certain survey questions can do one of two things:

  • Force respondents to “pick a side”
  • Allow respondents to choose a neutral answer

In other words, the answer options determine whether or not respondents must give their opinion on a certain topic. Forced-response questions provide an even number of choices, while neutral-response questions provide an odd number.

Note that the wording of the question itself need not change; only the answer options do.

Consider the survey question, “Agree or disagree with the following statement: ‘Our customer service department was helpful.’”

Forced-response options would be as follows:

  • Strongly disagree
  • Disagree
  • Agree
  • Strongly agree

On the other hand, neutral-response answers would be as follows:

  • Strongly disagree
  • Disagree
  • No opinion
  • Agree
  • Strongly agree

Either option, of course, has its pros and cons.

Clearly, forced-response questions force respondents to decide: was the customer service helpful or not?

On the one hand, this can mitigate instances in which respondents wish to choose the neutral option in lieu of skipping the answer entirely, or in which they truly want to answer negatively but perhaps don’t want to “ruffle any feathers.”

On the other hand, perhaps respondents truly were indifferent to the customer service they received – and now have to choose a side whether they actually believe their choice or not. Of course, they could always skip the question altogether – which provides you with absolutely no information at all.

However, it’s also important to realize that a neutral answer is not a null answer. In other words, there’s a lot to glean from a neutral response.

Think about the customer who reports having “no opinion” on how your customer service department affected their experience. While it doesn’t seem as if anything’s going wrong in this area, there isn’t much to celebrate, either. In other words, a neutral response can be a sign that you have room for improvement in a certain area if you want to be able to “wow” your customers.

A Quick Note On Net Promoter Score

Net Promoter Score measures a customer’s propensity to recommend your services to people in their network.

NPS is derived by the following steps:

  • Asking customers the question: “On a scale of 0-10, how likely are you to recommend our brand to your friends, family members, or colleagues?”
  • Defining Promoters, Passives, and Detractors using the following criteria: a response of 9-10 is a Promoter, a response of 7-8 is a Passive, and a response of 0-6 is a Detractor
  • Determining the percentage of responses defined as Promoters and Detractors
  • Subtracting the percentage of Detractors from the percentage of Promoters

The resulting score gives you insight into your brand’s perceived value, as well as how you stack up against other companies within your industry.
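
The steps above amount to a short calculation. Here’s a worked sketch in Python (the sample scores are purely illustrative):

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 responses."""
    promoters = sum(1 for s in scores if s >= 9)   # responses of 9-10
    detractors = sum(1 for s in scores if s <= 6)  # responses of 0-6
    # NPS = % Promoters - % Detractors, on the conventional -100..100 scale
    return 100 * (promoters - detractors) / len(scores)

# 10 responses: four Promoters, three Passives, three Detractors
example = [10, 9, 9, 10, 8, 7, 7, 6, 3, 0]
print(nps(example))  # → 10.0
```

Note that Passives (7-8) don’t appear in the formula directly, but they still lower the score by diluting the percentage of Promoters.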

Mistakes To Avoid When Writing Customer Survey Questions

As we’ve alluded to throughout this (and previous) sections, the way in which you word your survey questions can influence your respondents’ answers heavily.

Your wording can also nullify their responses entirely.

We’ve discussed question wording at length before, but let’s briefly review some of the most common (and detrimental) mistakes that can be made when writing survey questions.

Double-Barreled Questions

Double-barreled questions are those that mention two topics at the same time, creating a dilemma for respondents.

For example, consider the question (or rather, statement) “The checkout process was quick and easy.” For the respondent to “strongly agree” with this statement, the checkout process would have had to be both quick and easy for them. If it was easy but took more time than expected, no answer accurately reflects their experience: agreeing overstates the speed, while disagreeing understates the ease.

On the surveyor’s end, there’s no way to understand the meaning behind a negative response to such a question. Was the checkout process quick but not easy? Was it easy but not quick? Was it slow and difficult? It’s impossible to tell without reaching back out to the respondent – rendering the initial survey’s results moot.

Double-barreled questions can easily be fixed by simply breaking them into two questions. This ensures your respondents won’t be confused by the question, and that their answers will address one topic only.

Leading Questions

Leading questions, intentionally or not, make respondents feel as if there’s a certain “right” answer to the question at hand.

In the “real world,” leading questions are used all the time, like “You don’t want to miss out on a great deal, do you?” or “That was a great meal, don’t you think?”

Those are obvious examples of leading questions, and they’re easy enough to avoid when writing survey questions.

But leading questions can be much more subtle than that. For example, the question “How high would you rate our customer service?” plants a seed in the respondent’s mind that the service was high in quality, and it’s just a matter of how great the service was.

You can fix a leading question by removing any semblance of quality from it. Using the previous example, you’d simply ask: “How would you rate our customer service?” You’d then provide a Likert scale, defining 1 as “Very Poor” and 7 as “Very Good.”

Loaded Questions

A loaded question makes an assumption about the customer’s experience without verifying it, then asks a question based on that assumption.

For example, the question “How would you define your interaction with our customer service team?” makes the assumption that the respondent, indeed, interacted with the company’s customer service team.

In such instances, respondents might skip the question, or they might simply choose the “neutral” or “not applicable” option. But, going back to what we talked about earlier, this doesn’t give the surveyor much information. Is the respondent saying they have no opinion about the quality of service the team provided? Or did they not interact with the team at all? Again, there’s no way to immediately tell either way.

To avoid loaded questions, you’ll first need to ask a qualifying question (e.g., “Did you engage with our customer service team?”). Then, depending on the respondent’s answer, you can use branching or skip logic to either ask a follow-up question regarding the quality of service provided or move on to the next question.
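Most survey tools handle branching for you, but the underlying logic is simple. Here's a minimal sketch of skip logic in Python – the function name, question text, and answer keys are all illustrative, not part of any particular tool's API:

```python
# Skip logic sketch: the service-quality follow-up is only shown to
# respondents who confirm they interacted with the support team.

def next_question(answers):
    """Return the next question to show, given the answers so far."""
    if "contacted_support" not in answers:
        # Qualifying question always comes first
        return "Did you engage with our customer service team? (yes/no)"
    if answers["contacted_support"] == "yes" and "service_rating" not in answers:
        # Follow-up only shown to qualified respondents
        return "How would you rate our customer service? (1-7)"
    return None  # skip ahead to the next section of the survey

print(next_question({}))                           # qualifying question
print(next_question({"contacted_support": "no"}))  # skips follow-up → None
print(next_question({"contacted_support": "yes"})) # rating question
```

The key point: a respondent who answers “no” never sees the rating question, so you never have to guess what a “neutral” answer really meant.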

Check out this post on our blog to learn more about how to avoid making mistakes that could render your entire customer survey useless.

(Note: You can apply everything you’ve learned in this guide by creating a beautiful, customized survey with Fieldboom. It’s free to get started and you’ll gather the insights you need to improve your product, increase your sales and keep your customers happy.)



Collecting & Analyzing Customer Survey Responses

Once you’ve distributed your customer survey and have begun receiving responses, you can begin the process of categorizing, analyzing, and assessing the information you’ve collected.

In this section, we’ll discuss:

  • How to determine what constitutes a statistically significant sample size
  • How to address negative feedback
  • What to do with the data you collect
  • How soon you should send your next survey

We’ll begin by discussing the factors that determine statistical significance.

Factors Determining Statistical Significance

Understanding statistical significance is incredibly important in order to collect data that’s actually meaningful and actionable.

Think about it – if you send a survey to 1,000 customers, and only 10 reply, there’s no way you’d be able to extrapolate and apply those ten responses to your entire customer base.

On the other hand, if 900 out of 1,000 customers replied, you could be confident that those responses were a solid representation of the whole population.

Statistical significance is determined by the following factors:

  • Population Size: The total number of individuals you plan on sending your survey to
  • Margin of Error: A percentage describing how much the data collected from your sample might deviate from the opinions of the entire population. The closer your sample size is to your actual population size, the higher the probability that the data collected accurately represents the whole – resulting in a lower margin of error.
  • Confidence Level: Your confidence (expressed as a percentage) that, if you repeated the survey with different individuals from the same population, the results would be at least similar – if not exactly the same. Confidence levels are generally set at 90%, 95%, or 99%, depending on the importance of the survey.
  • Sample Size: The total number of completed surveys you need to get back in order for your results to be statistically significant
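These four factors fit together in the standard sample-size formula. Here's a minimal sketch in Python, assuming the common approach of using a z-score for the confidence level plus a finite-population correction (the function name and defaults are ours for illustration):

```python
import math

def required_sample_size(population, confidence=0.95, margin=0.05, p=0.5):
    """Estimate how many completed responses you need.

    Uses the standard sample-size formula with a finite-population
    correction. p=0.5 (maximum variability) is the most conservative
    assumption when you don't know the expected response distribution.
    """
    # z-scores for the three commonly used confidence levels
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)         # finite-population correction
    return math.ceil(n)

# For 1,000 customers, 95% confidence, ±5% margin of error:
print(required_sample_size(1000))  # → 278
```

So in the example above, those 10 replies out of 1,000 fall far short of the roughly 278 completed surveys you'd need; tightening the margin of error or raising the confidence level pushes the requirement higher still.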

Addressing Negative Feedback

We talked a bit about this before, but to reiterate:

While it’s fine to acknowledge most respondents through automation, it’s essential that you reach out personally – and immediately – to those who responded negatively to your survey.

After determining the problem (by analyzing the customer’s survey responses), reach out to them in order to:

  • Apologize for the inconvenience or lack of service provided
  • Empathize with their specific situation
  • Offer options for how you’ll fix the problem
  • Deliver the “fix”
  • Provide added value, such as a full refund or voucher for discounted services in the future

By responding to unsatisfied customers as quickly as possible – and remedying their issue in an efficient manner – you increase the chances that they’ll give your company another chance. Otherwise, you run the risk of losing them for good.

Using The Data To Improve Your Products & Services

We talked about this in the earlier section on defining goals for your survey.

But it’s much easier to know exactly what you need to do to improve your company’s operations once you actually have concrete data from your customers in front of you.

The moment you reach statistical significance, you should begin thinking about what the data is telling you, and where you need to go moving forward.

Quick side note: While you may want to start making moves from the moment you begin collecting data, it’s important to wait until you’ve reached statistical significance – otherwise you may end up making moves in the wrong direction.

Now, this doesn’t mean you need to have a solidified plan of attack immediately. But, at this point, you should at least have a general idea of:

  • What areas of your service you need to focus on
  • Who will be involved within your company when making these improvements
  • What other resources will be needed to make these improvements
  • How long it will take to begin seeing results

As you collect more and more data, you’ll be better able to create a concrete plan. For now, stay flexible.

Thinking Ahead To The Next Customer Survey

Once you’ve collected all the data you expect to collect, have begun implementing changes to your operations, and have started to see results, you can start thinking about sending out a subsequent survey to the same population segment.

The general rule of thumb is you should wait about six months before sending out the next survey. This will not only give you time to implement the improvements you’ve determined need to be made, but it will also give your customers time to experience these improvements, as well.

Now, this six-month period isn’t a hard-and-fast rule, as some industries are more volatile and subject to fluctuation than others. For example, the needs of consumers within the electronics industry change with every new product update. On the other hand, the needs of hotel guests remain relatively steady until major industry overhauls occur.

That being said, no matter how volatile your specific industry is, you should never send a single population segment more than one survey every two months. The more surveys you send, the less you can expect to get back – and the lower the validity of each survey received becomes.


It Pays (Literally) To Spend Time Crafting Well-Designed Customer Surveys

So, there you have it – our blueprint to take your customer surveys from ideation to implementation and beyond.

Sure, it might take a little longer to follow the steps we’ve outlined here. But doing so guarantees that you’ll ask the right questions, get insightful answers, and, most importantly, be able to use that feedback to keep growing your business in a way that creates happy, satisfied customers who will tell their friends and family about your products and services.

And at the end of the day, nothing will grow your business faster than word-of-mouth (WOM) marketing. It’s free and amazingly effective.

Good luck! 🚀🚀🚀

Get Started Now

You can apply everything you’ve learned in this guide by creating a beautiful, customized survey with Fieldboom. It’s free to get started and you’ll gather the insights you need to improve your product, increase your sales and keep your customers happy.