Oftentimes I hear that surveys are not a good way to collect feedback from learners. Or that people are getting tired of answering surveys, which then leads to low response rates. Or people are not getting any insights or actionable data from their surveys. All of these reasons can give the perception that surveys are a waste of time.
How can we shift the perception of surveys?
I think one way is to eliminate poor survey questions. We often ask poor questions because that’s the way we have always done it. The questions may not seem that bad because we have gotten used to seeing them over and over again. However, taking the time to critique your current survey could eliminate some of these poor questions. Here is my list of five bad survey questions to consider dropping or revising:
On a scale of 1 to 10, how did you enjoy the catering at today’s session?
You have probably seen questions that ask the learners about the facility, classroom, or food. I am not sure any of these things have an impact on the learning that takes place. Yes, I agree that the environment does impact a person’s ability to receive the information, but how much control do you, as a learning leader, have over the food and classroom? And the key to any survey question is to know what you are going to do with the data. Do you currently share results back with the catering company or with your facilities team? If we cannot act on the information we gather, then we should instead ask other questions that we can act on.
The registration process for this course was efficient and effective. (strongly disagree to strongly agree)
This one is trying to collect data about how people registered for the training. But how much control does a learning experience designer have over the registration process? Often this question is really asking for feedback on the LMS or your organization’s learning technology, which we already know is hard to change or improve. Another problem with this question is the use of the word “and.” Asking a learner to rate two things in one item can confuse them. What if the registration process was effective but not efficient? Also, use words that are easy for the learner to understand; learners may have different ideas about what “efficient” and “effective” mean. Instead, you could ask, What problems did you have registering for the course, or How did you learn about this training opportunity?
The trainer was effective and knowledgeable. (strongly disagree to strongly agree)
We do need to understand what part the trainer played in the learning experience. But this question is vague and leaves us with data that we cannot act on. What can we do if the trainer is not effective and not knowledgeable? This item does not tell us the behaviors that the trainer exhibited in the training. If you really want to know about the trainer, then use a behavior checklist and observe the trainer in the classroom. Or design a series of questions that lead to specific behaviors such as, The trainer encouraged participants to take part in class discussions, the trainer answered my questions, or the trainer gave me actionable feedback. If you don’t want to lengthen your current survey, then you can determine the effectiveness of the trainer by analyzing the comments from the question, How could this course be improved? And if you don’t have a way to quickly share the feedback with the trainer, then you should not collect the data in a survey.
The content of the course met the objectives. (strongly disagree to strongly agree)
We often ask this question to determine if a course achieved the learning objectives. However, learners often do not know the objectives of the course. And even if the trainer explained the objectives at the start of the training, how would a learner know whether they met or achieved those objectives? One way to determine if a course met the objectives is to design a test and use the score as your measure. We often use surveys to collect data because we cannot use a test. Instead of asking about the objectives, you can ask the learner to note one thing they learned that they know they will use. You can then analyze the comments to see how well they match up with the objectives.
Overall, how satisfied are you with this learning experience? (not satisfied to very satisfied)
We often ask this question to determine if the course met the needs of the learner. However, when we use the word “satisfied,” it leaves much of the meaning up to the learner. The learner could interpret satisfaction to mean the training was engaging, useful, relevant, or meaningful, or simply that they liked the people in the classroom. What if the learning experience was compliance training? How satisfied were you with your last compliance training experience? We are usually happy just to be done with compliance training.
So, what can you ask instead? We need to ask questions that determine the effectiveness of the training. Effectiveness can be measured by rating agreement with items such as, The training will help me be more successful on the job, my job performance will improve as a result of this training, or this training was worth the time away from my job. Learners value a learning experience that is useful, helps improve their performance, and is a good use of their time.