Survey Sins

I feel like I’ve completed a lot of surveys lately.

Surveys can be a really effective way to gather information easily, especially online. Unfortunately, the ease of the technology seems to let people forget how much they are asking of their respondents. In this post, I look at some surveys I've been asked to complete lately, to understand what makes them bad.

[Screenshot: Screen Shot 2015-08-01 at 1.46.42 AM]

Example A: This survey was sent out to hundreds of physics teachers on a listserv, asking them to help with doctoral research.

Sin 1: Incomprehensibility. Here, the first question is misplaced (it should come after the fourth question, I think).

Sin 2: Question types not matching the information being requested. Note above the use of small text fields for questions asking for numbers (which is fine), but then the use of the same small text field for a question asking the respondent to provide an elaborated, “please describe”, response. A similar sin is confusing radio buttons (circles – choose one) and checkboxes (squares – choose zero or more).
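For reference, standard HTML form controls make this distinction concrete. Below is a minimal sketch (the question wording and field names are invented for illustration, not taken from the surveys discussed here):

```html
<!-- Radio buttons render as circles: respondents choose exactly one -->
<p>Years of teaching experience:</p>
<label><input type="radio" name="experience" value="0-5"> 0–5</label>
<label><input type="radio" name="experience" value="6plus"> 6 or more</label>

<!-- Checkboxes render as squares: respondents choose zero or more -->
<p>Courses taught (check all that apply):</p>
<label><input type="checkbox" name="courses" value="physics"> Physics</label>
<label><input type="checkbox" name="courses" value="chemistry"> Chemistry</label>

<!-- A short numeric answer fits a small single-line field -->
<label>Number of students: <input type="text" name="students" size="4"></label>

<!-- A "please describe" prompt needs a multi-line textarea, not a small field -->
<label>Please describe your curriculum:
  <textarea name="curriculum" rows="4" cols="60"></textarea>
</label>
```

Matching the control to the question — one-of-many, any-of-many, short answer, or extended answer — signals to respondents what kind of response is expected.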

Sin 3: Asking irrelevant questions. The question about gender, above, doesn’t seem to be relevant to the analysis (there are no further demographic questions in the study).

Sin 4: Making questions required (especially demographics). It is a cornerstone of modern research that human studies need to be voluntary, and that participants have the right to withhold information and skip questions, especially about demographics. Making questions “required” violates this expectation.

Sin 5: Surveys that are too long. The survey above has 64 Likert-style questions. Honest, thoughtful answers will take much more than the 15 minutes suggested in the introduction. Overlong surveys inconvenience respondents and risk invalidating the study by causing them to quit, or to stop taking the survey seriously.

[Screenshot: Screen Shot 2015-08-01 at 1.39.15 AM]

Example B: This survey was sent to teachers and seems to be designed to assess how teachers’ understanding of student misconceptions evolves as a result of professional development.

Sin 6: Poorly designed questions. The question above seems to have been written around a misconception (that superconductivity means faster electric flow), but the question doesn't work here. High school students are unlikely to study superconductivity in meaningful depth, if at all, and there is no way for respondents to indicate this (the “sturdier” option is easily discarded through testwiseness). That leaves only two plausible answer choices, and options D and E don't work for the “Wrong Answer” row. This question type worked for other items, but it doesn't belong here.

[Screenshot: Screen Shot 2015-08-01 at 1.35.29 AM]

Example C: From a survey nominally requesting feedback about the process of grading an exam.

Sin 7: Questions that don't follow. Above, the second question presumes a particular response to the preceding multiple-choice question. I think the intention is “if you answered in such a way, then why?”, but that is not the prompt that is given.

Sin 8: Technical errors. A second text box appears for an optional second subject, but the box remains on screen even when no second subject is entered. This is confusing and distracting.

[Screenshot: Screen Shot 2015-08-01 at 2.31.54 AM]

Example D: From a survey requesting information for college counselling.

Sin 9: Redundant questions. The survey above effectively asks for the same information three times: first it collects my username, then it asks for my name, and then for the title of my course. Example B, above, required me to enter personal demographic information twice. Surveys should not ask for information that can be obtained more reliably elsewhere.
