How to design a good survey (guide)

From LimeSurvey Manual

Revision as of 19:57, 18 November 2019

'A simple guide to making a good survey'.

LimeSurvey makes it very easy to prepare a survey quickly. Unfortunately, that also makes it very easy to put together a bad survey.

On this page you will find a simple guide to building a survey that will not only be easy for your audience to answer, but will also give you meaningful answers.

This blog also provides another useful list of "Survey Design Tips and Tricks".

Before making a survey

Some important questions need to be answered before designing any questionnaire, even before deciding whether a questionnaire is the right way to go at all.

What do you actually want to find out with your research?

Based on that:

  • Will a survey help me with this problem?
  • How can the survey help me answer the problem?
  • Who are the right people to ask?
  • How can you reach them?
  • What do they need in order to understand your questions?
  • Which statistical methods, if any, do you want or need to apply to the data?

These are just some of the issues you need to be clear about in order to decide whether LimeSurvey is the right tool for you. LimeSurvey is ideal for questionnaires that are fully structured (you know all the questions you might have to ask before starting the interview), standardized (everyone gets more or less the same questionnaire), mostly quantitative (it is mainly about numbers or questions with predefined answers), and collected online.

To some extent, of course, you can deviate from this: you can use LimeSurvey to collect answers for some types of telephone interviews, and you can use LimeSurvey to collect qualitative data, for example via text questions.

But at some point you may come to the conclusion that other research methods are better suited.

Structuring a questionnaire

When deciding in which order to ask the questions and how to group them, there are a few aspects to consider, for example:

If possible, start with questions that are easy to answer and that all participants feel comfortable answering. Often these will be screening questions of some kind, i.e. questions you have to ask to find out whether you are asking the right people (use conditions and/or quotas to handle these screening questions).

Putting a few "ice-breaker" questions at the beginning can help keep participants from abandoning your survey, since people are less likely to drop out once they have already put some work into answering the questionnaire.

 Which of the following fruit do you like?
#Apples   ()
#Bananas  ()
#Cherries ()

(single choice)

You can use conditions so that the following question about cherries only appears if the participant answers "cherries".

Why do you prefer cherries?
#They are tasty
#I love the color
#They are healthy
#They are juicy
#I love cherry pie.

(multiple choice question (or single choice, depending on whether you need exact data))
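The branching behavior that conditions implement can be sketched as plain code. This is only an illustration of the idea, not LimeSurvey's actual API: in LimeSurvey you configure this in the Conditions designer rather than writing code, and the question texts below are the hypothetical examples from this guide.

```python
# Skip logic sketch: the cherry follow-up is only asked when "Cherries"
# was among the fruit the participant selected. (Illustrative only;
# LimeSurvey expresses this via its Conditions feature, not code.)

def next_question(liked_fruit):
    """Return the follow-up question to show, or None to skip it."""
    if "Cherries" in liked_fruit:
        return "Why do you prefer cherries?"
    return None  # participant never sees the cherry question

print(next_question(["Apples", "Cherries"]))  # follow-up shown
print(next_question(["Bananas"]))             # follow-up skipped
```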

How much do you like cherries?
#1) Not much more than other fruit
#2) I like them much more than other fruit
#3) It's one of my favorite fruits
#4) I ADORE CHERRIES!

(single choice)

Do you know any recipes involving cherries?

[text field]

Above is an example of easy, engaging questions leading up to the main question.

The goal was to gather recipes involving cherries, apples and bananas.

On the other hand:

If you need to ask problematic questions, consider using different pages (one page per group or one page per question) and putting these questions at the end. That way, if participants decide to drop out, at least their previous answers are saved.

Another structural aspect is avoiding bias introduced by the questionnaire itself.

For example, in market research there are concepts that require unaided and aided questions. An example of an unaided question would be:

 "Which brands of chocolate do you know?"

(followed by an empty text box)

The corresponding example for an aided question would be:

 "Which of the following brands of chocolate do you know?"

(followed by a list of brands (multiple choice))

If you would like to include both in the same questionnaire, you should again make sure to put them on different pages and put the unaided version before the aided one; otherwise you will basically teach, or at least actively remind, participants about existing brands, which will invalidate your results for a following unaided question.

Individual questions

Questions should be non-suggestive. So "What is your opinion about LimeSurvey?" is an acceptable question while "Don't you agree that LimeSurvey is a really great tool?" is most likely not.

Example:

People may say "yes" to donating money if you ask the questions this way:

  • Do you love nature?
  • Will you donate money to help the river?

But probably will say "no" if you ask the questions this way:

  • Is lack of money a problem for you?
  • Will you donate money to help the river?

To avoid this kind of thing, try to have your questions go:

  • from the least sensitive to the most sensitive
  • from the more general to the more specific
  • from questions about facts to questions about opinions

Also, a survey question can be:

  • Open-ended (the person can answer in any way they want), or
  • Closed-ended (the person chooses from one of several options)

Closed-ended questions are much easier to tally up later on, but may stop people from giving the answer they really want.

Example: "What is your favorite color?"

Open-ended: Someone may answer "dark fuchsia", in which case you will need to have a category "dark fuchsia" in your results.

Closed-ended: With a choice of only 12 colors your work will be easier, but they may not be able to pick their exact favorite color.

Look at each of your questions and decide whether it should be open-ended or closed-ended (take the opportunity to rewrite any questions, too).

Example: "What do you think is the best way to clean up the river?"

Make it Open-ended: the answers won't be easy to put in a table or graph, but you may get some good ideas, and there may be some good quotes for your report.

Example: "How often do you visit the river?"

Make it Closed-ended with the following options:

  • Nearly every day
  • At least 5 times a year
  • 1 to 4 times a year
  • Almost never

You will be able to present this data in a neat bar graph.
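One practical payoff of closed-ended questions is that the counts fall straight out of the data. A minimal sketch of turning such answers into bar-graph-ready counts (the response strings below are made-up sample data, not real results):

```python
from collections import Counter

# Invented sample answers to "How often do you visit the river?"
responses = [
    "Nearly every day", "Almost never", "1 to 4 times a year",
    "Almost never", "At least 5 times a year", "Almost never",
]

counts = Counter(responses)          # option -> number of respondents
for option, n in counts.most_common():
    print(f"{option:24} {'#' * n}")  # crude text bar graph
```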

When working with multiple choice or single choice questions, make sure to choose the right one and formulate both question and answers appropriately.

For example:

Which of the following fruit do you like?
#Apples   ()
#Bananas  ()
#Cherries ()

is a typical multiple choice question, as you can like several of those; by contrast, "Which of the following fruit do you prefer?" is a single choice question.

Both fruit examples have been formulated to make clear that they are only about the fruit listed as answers. If your question were "Which is your favorite fruit?" you should either have a really exhaustive list of fruit or, more likely, use LimeSurvey's setting to add an "other" field. More generally, answer options in most cases need to be complete, mutually exclusive and definite.

If you have multiple or single choice questions with a lot of answer options, you need to be aware that this might introduce another bias, as participants are likely to pay more attention to the very first options than to those in the middle. LimeSurvey offers a nice option to randomize the order of answer options and thereby diminish this problem to some extent.
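The randomization idea can be sketched as follows. This is only an illustration of what the built-in setting does conceptually; LimeSurvey handles this for you, and the option names are the hypothetical fruit examples from this guide:

```python
import random

# Sketch: show answer options in a different order per participant to
# reduce primacy bias (over-attention to the first options in a list).
OPTIONS = ["Apples", "Bananas", "Cherries", "Dates", "Elderberries"]

def options_for_participant(participant_id):
    """Return a participant-specific ordering of the same options."""
    shuffled = OPTIONS.copy()
    # Seeding with the participant id keeps the order stable for that
    # participant while varying it across participants.
    random.Random(participant_id).shuffle(shuffled)
    return shuffled

print(options_for_participant(1))
print(options_for_participant(2))
```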

What makes a good survey?

There are 3 features of a survey which might make it useful:

  1. The questions are clear and precise, collectively allowing for detailed, unambiguous and meaningful answers.
  2. All predefined answers provided and their formats are appropriate to the question.
  3. There is room for people to add additional information if they need.

Adding to that, always keep the user experience in mind. Reading, scrolling and clicking are tiring activities, so:

  1. Avoid any unnecessary question
  2. Use conditions to avoid asking questions not relevant for a specific participant
  3. Keep questions and answers short and easily readable - use appropriate markup
  4. Think about the trade-off between scrolling and clicking. Display everything on one page for short questionnaires (5-15 questions, depending on question complexity). Use groups wisely for longer questionnaires, i.e. group questions comprehensibly; use group descriptions to give a clear statement about the topic of the following questions.
  5. Avoid confusing participants with different scales, i.e. limit the amount of different scales types, scale scopes and different scales descriptions as much as possible. Try not to change the direction of scales. (There are some methodological exceptions).
  6. For rating scales it might be useful to use an even number of rating options so the user has to decide for a certain direction (see below).
Example for good answer scales:

1. Very good

2. Good

3. Quite good

4. Not that good

5. Bad

6. Very bad

Example for bad answer scales:

1. Good

2. Fair

3. Bad

The best way to start designing a survey is to take a second to imagine your ideal feedback. It goes without saying that meaningful responses are the most useful ones, so try to create questions which invite these answers.

How can you go about that? The best method is to split up all the areas and decide on what information you need.

For example, imagine you held an event, open to the public, and needed to get some general feedback about the event.

A "bad" survey might be similar to the following:

Did you enjoy the Event?

( ) Yes

( ) No

How good was the wifi?

1 2 3 4 5 6 7 8 9 10

( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( )

Did you arrive OK?

( ) Yes

( ) No

Was the provided map OK?

( ) Yes

( ) No

How did you feel about the mixture of speakers?

( ) Very sad ( ) Sad ( ) Neutral ( ) Happy ( ) Very Happy

Avoid matrix questions

That may be a bit of a generalisation, but it's a good rule to live by. Anything that involves a scale of any sort should be avoided unless you want an answer that can be measured on a scale, like age, time (maybe) or quantities. Similarly, a matrix of compulsory questions can be a bit of a deterrent for your audience: not only are they badly structured, but they don't allow for any extra information.

Chances are if someone is completing your survey they want to give feedback, and if they don't think they can share anything useful they'll just close the window and move on.

So what's wrong with the survey above?

Let's look at each question step by step.

Question 1 doesn't really achieve anything. Imagine receiving 100 "no" responses. Firstly you'd be sad, but secondly you would have absolutely nothing to do with the information. We'll look at a possible improvement to this in a moment.

Question 2 is so bad it's painful. If we go back to our 3 features, we see that questions need to be clear and precise. I'm not an expert in wifi, but I'm fairly certain you don't measure it in "goods". What's more, it doesn't allow for a meaningful answer: what will you do with the knowledge that 33% of people gave your wifi one rating while 23% gave it another? Point number 2 tells us that the predefined answers need to be appropriate to the question.

It's fairly obvious that a scale of 1 to 10 won't help you improve the quality of your wifi. The real killer here is the missing space for people to add additional information: how could someone report a specific problem?

In this case? It's impossible. Surveys are all about finding out something you can work with or learn from.

The next two questions have the same result – they provide a platform for people to say yes or no. And neither of those questions allow for detail. In a better survey, which we'll consider in a moment, you could learn particular problems from specific people.

The final question is another painful one. Asking for the level of satisfaction about something is pretty much useless, as every person has different interests, and so everyone will likely have different opinions on each speaker. It's another example of where a range question is being used and shouldn't be.

Have a look at the improved survey below.

Did you make use of the in-house wifi?

( ) Yes

( ) No

Did you experience any problems?

( ) No problems at all

( ) A couple of small issues, nothing major

( ) One or two severe issues

( ) The wifi was nearly totally unusable.

If you experienced problems, could you briefly describe them? (Text field)

Did you arrive OK?

( ) Yes

( ) No

How did you come to our event?

( ) Train

( ) Car

( ) Bus

( ) Light Rail (e.g. Tube, Tram)

( ) Walk

Did you use the map from our website?

( ) Yes

( ) No

If you looked at the map, was it detailed enough?

( ) Yes

( ) It gave me a rough idea, I used another map for more detail though

( ) Not detailed at all.

If you didn't use the map, why not?

( ) Not enough detail

( ) I used a satnav/google maps instead

( ) I didn't even know it existed!

Generally speaking, were the speakers interesting? Did you enjoy the presentations?

( ) Nearly all were interesting and thoroughly enjoyable

( ) Mainly enjoyable, but a handful of less interesting talks.

( ) A 50–50 split

( ) More dull than interesting

( ) Didn't find any interesting.

Please elaborate on your answer to the question above. Feel free to reference any particular people/talks. (Text field)

If we could improve the range of talks, or you have any other interesting ideas regarding the talks, list them below. (Text field)

If you have any other recommendations or ideas, please list them below. (Text field)

This survey may be a little longer, but it's a lot easier both to answer and to interpret the answers from. Asking two or three questions about each topic means that when it comes to processing the results you can do a little more analysis. For example, imagine that in response to the first survey you received 30 people saying they didn't arrive OK.

This would have been as much information as you could possibly extract from the results, but with our new set of answers it would be possible to deduce which forms of transport presented people with trouble. What's more, you could go on to see whether they were using a provided map or another form, and use this to target improvements for the future.

Keep in mind that after about 50 questions, users will most likely stop reading the questions properly.

Other important additions are the text field questions – these allow your audience to give specific feedback which you can draw from. It's a good idea to not make these compulsory though, as they may put people off answering.

To conclude, when writing a survey you should aim to create one which asks specific questions targeted to the issues you need to answer. Also remember that it doesn't do any harm to gather a little extra background information as you can use it all to get better information from the result.

It's also important to phrase your questions well: if people don't understand the questions they need to answer, they'll close the window and move on. Try to hand your survey to someone else to proofread before making it publicly available, to ensure it's definitely crystal clear what each question is asking.

Survey Bias

In market research, one of the most critical things to avoid is survey bias: asking the respondent questions which may influence the results he or she provides and thus invalidate or skew your data. It is very easy and common for a company or individual without proper market research training and experience to err in this way. This extends to many things, from the way a question is phrased, to the types of responses available, to the way an interviewer presents the questions if data is being collected by phone or in person.

For example, a biased survey question may be written as such:

How much did you enjoy the event?

( )Very much

( )A little bit

( )Not very much

( )Not at all

At first glance, it appears that there is no problem with the structure of this question. After all, the respondent is able to provide his or her answer, from "very much" to "not at all". However, the problem lies in the question phrasing itself. By asking the respondent "how much" he or she enjoyed the event, the interviewer has already established a bias by assuming that the respondent enjoyed the event in some way, which may not be the case.

The following example would be a far better way to ask the question, as it does not lead or bias the respondent in any way:

How would you rate your overall enjoyment of the event on a 1 to 5 scale, where 1 is "Not at all enjoyed" and 5 is "Completely enjoyed"?

1 2 3 4 5

( ) ( ) ( ) ( ) ( )

Using this modified question, the respondent is not led to the assumption that any enjoyment was had, and is able to supply an answer in a numerical way that makes it easy for him or her to specify the enjoyment level, and makes it easy for the survey solicitor to tabulate and compare the results with other respondents.
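The tabulation benefit is easy to see in code. A minimal sketch, using made-up sample ratings (not real survey data), of how numeric 1-to-5 answers can be summarized and compared across respondents:

```python
# Invented sample answers to the 1-to-5 enjoyment question above.
ratings = [4, 5, 3, 4, 2, 5, 4]

mean = sum(ratings) / len(ratings)                # overall enjoyment level
distribution = {score: ratings.count(score)      # how many picked each score
                for score in range(1, 6)}

print(f"mean enjoyment: {mean:.2f}")
print("distribution:", distribution)
```

With free-text or "in goods" answers, no such simple arithmetic is possible; with a numeric scale, the mean and the distribution fall out in two lines.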

This is just one minor example of a good survey design principle. The same reasoning extends to every part of survey design.