When it comes to employee surveys, a key element often goes under-appreciated: designing an experience for humans. Survey experience has a strong impact on results, yet many (if not most) surveys still rest on design principles from decades ago. Any professional administering employee surveys should use them above all to measure the truth with a minimum of bias. That means staying current with how today's workforce thinks and re-evaluating how surveys are built.
Another major measure of a survey's effectiveness is its participation rate. This, too, is influenced by design: people can log in, get instantly discouraged by a poorly constructed survey, or bail out halfway through. On the positive side, International Truck and Engine Corporation revised its annual workplace survey using science-based guidelines from Harvard Business Review and doubled its response rate, from 33 percent to 66 percent.
Here are our seven top recommendations for creating an effective survey, based on human-centered science and design insights.
Target content carefully
To avoid bias, questions should be about specific, observable behavior from first-hand experience. Anchoring responses in real-world situations keeps participants grounded and focused rather than relying on an emotional impression of a situation. For example, when evaluating a leader, move from language such as “Does [leader] understand the market landscape?” to “[Leader] provides important competitor information regularly to my team” with a rubric of agreement options.
When building the content of a survey, identify aspects of performance that stand out as needing attention, but then research the behaviors underlying those problems. If a department is struggling to meet its KPIs, dig a little deeper—is it a communication issue or perhaps a problem of leadership style?
Survey setup matters
Research suggests an employee survey should take about 20 minutes to complete. This isn’t merely about offering a comfortable, streamlined experience: If it takes much longer, only employees with more time on their hands will be able to complete it, thus biasing the answers. The data will also be inherently less complete and actionable, because people become less thoughtful in their rush to finish.
In terms of visual design, keep formatting basic and uncomplicated within sections. Topic labels and boxes can make people think questions are related, so be mindful about visual styling. Ideally employees should address each question as a stand-alone moment to reflect.
Keeping the number of questions and their length/complexity similar also helps to avoid bias by spreading attention equally across topics. Consistency in general improves the validity of the survey.
Format rubrics to minimize bias
Although surveys capture qualitative information, they shouldn’t presume to understand employees’ feelings. Often surveys offer a 5-choice response rubric, each choice with an associated word or phrase like “exceeds expectations” or “far exceeds expectations.” Yet the science argues for using numbers on a continuum, with a guiding word/phrase such as “entirely agree” at each end, and letting people place their inferences on that scale. In part, this is more effective because the response options are mathematically evenly spaced. Who can say what the distance between “exceeds expectations” and “far exceeds expectations” really is, after all?
When using rubrics, an odd number of response options is best, because it allows people to select a middle or neutral choice. Make sure to include enough response options to capture a decent level of nuance, but not so many that results are confusing or difficult to analyze. In general, a 5- or 7-option scale is a safe bet. Including “Not Applicable” or “Don’t Know” options is also important to avoid cluttering the data with meaningless answers.
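Keeping “Not Applicable” and “Don’t Know” answers out of the averages is the analysis-side counterpart of this advice. As a minimal sketch (plain Python, with hypothetical data and labels), non-answers can be set aside before computing a question’s mean score:

```python
# Hypothetical 5-point scale responses for one survey question.
# "N/A" and "Don't Know" are valid selections but carry no numeric
# meaning, so they must be excluded from the average.
NA_VALUES = {"N/A", "Don't Know"}

def mean_score(responses):
    """Average only the numeric answers; non-answers are set aside."""
    numeric = [r for r in responses if r not in NA_VALUES]
    if not numeric:
        return None  # no usable data for this question
    return sum(numeric) / len(numeric)

responses = [5, 4, "N/A", 3, 5, "Don't Know", 4]
print(mean_score(responses))  # 4.2
```

Treating non-answers as zeros (or forcing people to pick a number anyway) is exactly the kind of data clutter the advice above warns against.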
Questions that require ranking a list of items tend toward bias, as people remember the first and last items best and rank them most decisively. Ranking also primes the first and last topics, so they stay on the participant’s mind for subsequent questions. This can skew the survey results and should generally be avoided.
Watch your language
For survey questions use plain, clear language and avoid terms that have strong associations with gender, ethnicity, religion, etc. For example, do not tie leadership effectiveness to stereotypically masculine terminology such as “[Leader] is bold and aggressive in tackling problems.” The workplace has a history of conflating stereotypically masculine features with leadership while punishing women for the same features, so a more neutral approach might be “[Leader] is effective at handling problems rapidly.”
It also turns out humans are generally agreeable on employee surveys: People tend to go along with questions, a tendency that grows as they move through the survey and get impatient. Research suggests wording about a third of questions negatively, since such items require respondents to disagree and keep them attentive. An example would be changing “I have a firm understanding of the company vision.” to “The company vision is vague to me.” Word these items carefully, though—a change from “complicated” to “uncomplicated” is easy for respondents to overlook.
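Negatively worded items need one extra step before analysis: their scores are typically reversed so that every item points in the same direction. A minimal sketch, assuming a 1–5 agreement scale and a hypothetical question, might look like:

```python
# Reverse-scoring a negatively worded item on a 1..5 agreement scale,
# so all items point the same direction before averaging.
SCALE_MAX = 5  # 1 = entirely disagree ... 5 = entirely agree

def reverse_score(value, scale_max=SCALE_MAX):
    """Flip a response on a 1..scale_max scale (5 -> 1, 4 -> 2, ...)."""
    return scale_max + 1 - value

# For "The company vision is vague to me.", strong agreement (5)
# actually signals *low* understanding, so it becomes a 1.
raw = [5, 4, 2, 1]
adjusted = [reverse_score(v) for v in raw]
print(adjusted)  # [1, 2, 4, 5]
```

Skipping this step silently cancels out the benefit of mixing question polarity, since agreement on negative items would then inflate the topic’s score.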
Paper is not dead
When it comes to surveys, it isn’t necessary to go extremely high tech. Facebook found that simply asking people how long they intended to stay at the company was twice as accurate at predicting turnover as sophisticated machine-learning forecasts. In fact, because people can be wary, HBR says providing the option of paper surveys may still be the way to go to engender trust in the survey process. Duke Energy put electronic and paper versions of the same survey up against each other and found the paper-based data to be more reliable and valid. The electronic one, hosted on a company server, skewed strongly toward favorable answers.
Handle demographic questions with care
Demographic data can be a very sensitive topic. Most surveys are “confidential” but not truly “anonymous,” precisely because of demographic questions. The details collected can make it entirely possible to track down who said what, so trust is a real concern. Trust at work comes from corporate culture at the highest level, but lingering doubts can be combatted with reassuring messaging in the survey itself. Explaining why the data is being collected and how it will be used shows transparency and helps participants feel more comfortable. In more delicate cases, surveys can even be administered and analyzed by a neutral third party.
With these demographic questions, always avoid being invasive or irrelevant. Doing so can dramatically damage response rates and throw the entire participant experience off. The best practice is to place these items at the end, make them optional, and keep them to a minimum.
Surveys influence behavior, so take that responsibility seriously
Facebook discovered another interesting phenomenon during their engagement survey: They asked a subset of employees if they were “personally committed to improving their experience [at work]” and found that the people they asked were 12 percent more likely to request resources and tools after the survey to become more engaged. This was true whether they responded yes or no to the question.
As it turns out, surveys can shape the way people behave and what topics are on their mind because we are fairly suggestible. Topics which are covered by a survey will stick with people, especially if asked in a personal way.
The lesson from this last point applies to all the guidelines we’ve shared here. When designing a survey, keep in mind that your goal is finding out the unbiased truth rather than steering the results in a rosy direction. Administering a survey comes with responsibility, but also a great opportunity to create positive change in your organization. Don’t waste it on a poorly designed survey.