
How to Mess Things Up Before You Start

There are a number of ways that a study can be ruined before you even start collecting data. The first we have already explored – sampling or selection bias, which occurs when the sample is not representative of the population. One example of this is voluntary response bias, which is bias introduced by collecting data only from those who volunteer to participate. This is not the only potential source of bias.
Sources of bias
Sampling bias – when the sample is not representative of the population
Voluntary response bias – the sampling bias that often occurs when the sample is volunteers
Self-interest study – bias that can occur when the researchers have an interest in the outcome
Response bias – when the responder gives inaccurate responses for any reason
Perceived lack of anonymity – when the responder fears giving an honest answer might negatively affect them
Loaded questions – when the question wording influences the responses
Non-response bias – when people refusing to participate in the study can influence the validity of the outcome

Example 14

Consider a recent study which found that chewing gum may raise math grades in teenagers[footnote]Reuters. http://news.yahoo.com/s/nm/20090423/od_uk_nm/oukoe_uk_gum_learning. Retrieved 4/27/09[/footnote]. This study was conducted by the Wrigley Science Institute, a branch of the Wrigley chewing gum company. This is an example of a self-interest study: one in which the researchers have a vested interest in the outcome of the study. While this does not necessarily mean that the study was biased, it certainly suggests that we should subject the study to extra scrutiny.

Example 15

A survey asks people “When was the last time you visited your doctor?” This might suffer from response bias, since many people might not remember exactly when they last saw a doctor and may give inaccurate responses.
Sources of response bias may be innocent, such as bad memory, or intentional, such as pressure from the pollster. Consider, for example, how many voting initiative petitions people sign without even reading them.

Example 16

A survey asks participants a question about their interactions with members of other races. Here, a perceived lack of anonymity could influence the outcome. The respondent might not want to be perceived as racist even if they are, and so might give an untruthful answer.

Example 17

An employer puts out a survey asking their employees if they have a drug abuse problem and need treatment help. Here, answering truthfully might have consequences; responses might not be accurate if the employees do not feel their responses are anonymous or fear retribution from their employer.

Example 18

A survey asks “do you support funding research of alternative energy sources to reduce our reliance on high-polluting fossil fuels?” This is an example of a loaded or leading question – questions whose wording leads the respondent towards an answer.
Loaded questions can occur intentionally by pollsters with an agenda, or accidentally through poor question wording. Question order is also a concern, since the order in which questions are asked can change the results. A psychology researcher provides an example[footnote]Swartz, Norbert. http://www.umich.edu/~newsinfo/MT/01/Fal01/mt6f01.html. Retrieved 3/31/2009[/footnote]:
“My favorite finding is this: we did a study where we asked students, 'How satisfied are you with your life? How often do you have a date?' The two answers were not statistically related - you would conclude that there is no relationship between dating frequency and life satisfaction. But when we reversed the order and asked, 'How often do you have a date? How satisfied are you with your life?' the statistical relationship was a strong one. You would now conclude that there is nothing as important in a student's life as dating frequency.”

Example 19

A telephone poll asks the question “Do you often have time to relax and read a book?”, and 50% of the people called refused to answer the survey. It is unlikely that the results will be representative of the entire population. This is an example of non-response bias, introduced by people refusing to participate in a study or dropping out of an experiment. When people refuse to participate, we can no longer be so certain that our sample is representative of the population.
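To see how non-response can distort a result, here is a minimal simulation sketch in Python. The numbers are hypothetical and not from the text: it assumes 40% of the population would truthfully answer “yes,” but that people who do have time to read are twice as likely to agree to answer the poll.

```python
import random

# Hypothetical illustration: a phone poll asking whether people often have
# time to relax and read, where non-readers are more likely to refuse to answer.
random.seed(1)

# Assume (for illustration only) that 40% of the population are "readers."
population = [True] * 40_000 + [False] * 60_000

responses = []
for is_reader in population:
    # Assumed response rates: readers answer 70% of calls, non-readers only 35%.
    answer_rate = 0.70 if is_reader else 0.35
    if random.random() < answer_rate:
        responses.append(is_reader)

true_rate = sum(population) / len(population)
observed_rate = sum(responses) / len(responses)
print(f"True proportion of readers:        {true_rate:.0%}")     # 40%
print(f"Proportion among poll respondents: {observed_rate:.0%}") # about 57%
```

Under these assumed response rates, roughly half of the people called never answer (matching the 50% refusal above), and the poll would report that about 57% of people often have time to read, even though the true figure in this made-up population is 40%.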

Try it Now 5

In each situation, identify a potential source of bias.
a. A survey asks how many sexual partners a person has had in the last year.
b. A radio station asks listeners to phone in their choice in a daily poll.
c. A substitute teacher wants to know how students in the class did on their last test. The teacher asks the 10 students sitting in the front row to state their latest test score.
d. High school students are asked if they have consumed alcohol in the last two weeks.
e. The Beef Council releases a study stating that consuming red meat poses little cardiovascular risk.
f. A poll asks, “Do you support a new transportation tax, or would you prefer to see our public transportation system fall apart?”

Licenses & Attributions

CC licensed content, Shared previously

  • Math in Society. Authored by: Open Textbook Store, Transition Math Project, and the Open Course Library. Located at: http://www.opentextbookstore.com/mathinsociety/. License: CC BY-SA: Attribution-ShareAlike.