Did 4% of Americans Really Drink Bleach Last Year? - Harvard Business Review

One of the most challenging elements of social science research is that researchers must often rely on data that comes from humans — and humans are notoriously unreliable. When the CDC published a report in the summer of 2020 stating that 4% of respondents reported ingesting household chemicals in an attempt to ward off the coronavirus, many people were (understandably) alarmed. Researchers who replicated the study, with the addition of some basic quality control measures to eliminate inaccurate data, got very different results. In this piece, the author discusses how inattentive or mischievous respondents can accidentally or intentionally skew data, as well as what researchers, reporters, and the public at large can do to identify these issues and avoid jumping to conclusions based on misleading information.

Early in the summer of 2020, the Centers for Disease Control and Prevention (CDC) issued a report on unsafe coronavirus prevention practices in the U.S. According to the report, 4% of the 502 respondents stated that they had drunk or gargled diluted bleach in the last month, 4% said the same about soapy water, and 4% said the same about household disinfectant. This quickly inspired a number of alarming headlines. (Reuters, for example, headlined one piece: “Gargling with bleach? Americans misusing disinfectants to prevent coronavirus.”)

This media response was understandable. While 4% may not seem like much, if this study sample was representative of the U.S. population, it would imply that roughly 12 million Americans engaged in these dangerous behaviors — an alarming figure indeed.
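As a quick sanity check on that extrapolation, the arithmetic looks roughly like this (the 300 million population figure below is an assumed round number for illustration, not one taken from the CDC report):

```python
# Back-of-the-envelope check of the "roughly 12 million" figure.
# The 300 million population is an assumed round number, not a figure from the report.
reported_rate = 0.04            # 4% of respondents
us_population = 300_000_000     # illustrative approximation

print(f"Implied number of Americans: {reported_rate * us_population:,.0f}")  # 12,000,000
```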

But there may be reason to question that conclusion. First of all, the CDC report noted that a survey of just 500 opt-in participants was not necessarily representative of the U.S. population (though the CDC did weight responses to line up with national age, gender, and race demographics, in order to at least somewhat address any disparities). And beyond these sample limitations, a second study (currently undergoing peer review) that aimed to replicate the CDC’s findings with some additional quality control suggests that the data itself could have some serious flaws.
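To give a sense of what that kind of demographic weighting involves, here is a minimal post-stratification sketch. The column names, the single weighting variable, and the target shares are all assumptions for illustration; the CDC weighted on age, gender, and race together rather than on one variable.

```python
# Minimal post-stratification sketch (illustrative only, not the CDC's actual procedure).
import pandas as pd

# Toy sample of opt-in respondents; column names are hypothetical.
sample = pd.DataFrame({
    "gender": ["F", "F", "F", "M", "M", "M"],
    "drank_bleach": [0, 0, 0, 1, 0, 0],
})

# Assumed national target shares for the weighting variable (illustrative numbers).
targets = {"F": 0.51, "M": 0.49}

# Weight for each respondent = population share of their group / sample share of that group.
sample_share = sample["gender"].value_counts(normalize=True)
sample["weight"] = sample["gender"].map(lambda g: targets[g] / sample_share[g])

# Weighted prevalence estimate for the behavior of interest.
weighted_rate = (sample["drank_bleach"] * sample["weight"]).sum() / sample["weight"].sum()
print(f"Weighted prevalence: {weighted_rate:.1%}")
```

Real survey weighting typically adjusts several demographic margins at once (for example, by raking), but the core idea is the same: each respondent's weight is the population share of their group divided by that group's share of the sample.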

Specifically, this new study from online research platform CloudResearch sought to address two major issues that can threaten data quality: inattentiveness (i.e., respondents that are careless or aren’t paying attention) and mischievousness (i.e., respondents who intentionally lie or mislead researchers). Psychologists who study relatively rare behaviors, such as hard drug use, have long known about these challenges. For example, in one study on drug use from back in 1973, researchers found that when they included a fake drug in the list of drugs they asked people about, 4% of respondents reported taking a drug that didn’t exist, suggesting that the data was likely not totally reliable.
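The logic of that check can be sketched in a few lines: include one item that cannot truthfully be endorsed, and treat the share of respondents who endorse it anyway as a noise floor for the real items. The item names and toy records below are placeholders, not the 1973 study's actual questionnaire.

```python
# Fictitious-item sketch: "made_up_drug" does not exist, so any "yes" to it is unreliable.
# Item names and records are placeholders for illustration.
answers = [
    {"cannabis": True,  "made_up_drug": False},
    {"cannabis": False, "made_up_drug": False},
    {"cannabis": True,  "made_up_drug": True},   # claims a drug that doesn't exist
    {"cannabis": False, "made_up_drug": False},
]

noise_floor = sum(a["made_up_drug"] for a in answers) / len(answers)
raw_rate = sum(a["cannabis"] for a in answers) / len(answers)
print(f"Noise floor: {noise_floor:.0%}; raw self-reported rate: {raw_rate:.0%}")
# Any real-item rate close to the noise floor should be treated with caution.
```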

You might have noticed the recurring 4% figure. It turns out that might not be a coincidence. Prominent psychiatrist and blogger Scott Siskind dubbed this figure the “Lizardman’s Constant” back in 2013, in reference to a widely publicized Public Policy Polling report that 4% of respondents said they believed shape-shifting lizard people were controlling the world. That poll garnered a lot of attention in the media, including headlines like this one: “Conspiracy craze: why 12 million Americans believe alien lizards rule us.” But Siskind and others argue that this 4% is far more likely to reflect inattentive and mischievous respondents than a true belief in such an outlandish conspiracy.

As with the lizard people survey, the CDC report generated a lot of publicity. And as with the lizard people survey, these responses could be legitimate — or they could simply be bad data.

To begin to get to the bottom of this, the second study sought to identify whether inattentiveness and/or mischievousness could explain the CDC’s surprising findings. After asking the same questions as the CDC survey, researchers had participants complete a short word association exercise (i.e., circle the unrelated word from a list) to test for attentiveness. These questions were designed to be very easy for anyone with a basic English reading level to get right, as long as they were paying attention. Next, to target mischievous respondents, the researchers asked “reality check” questions: questions with only one reasonable answer, such as “Have you died of a heart attack?” or “Have you ever used the internet?” (The survey was distributed online.)
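A minimal sketch of how responses might be screened with such checks is shown below. The field names, pass criteria, and toy records are assumptions for illustration rather than the study's actual instrument or code.

```python
# Screening sketch: drop respondents who fail the attention or reality checks.
# Field names and records below are hypothetical.
from dataclasses import dataclass

@dataclass
class Response:
    ingested_chemical: bool          # answered "yes" to drinking or gargling a cleaner
    attention_correct: int           # word-association items answered correctly
    attention_total: int             # word-association items asked
    says_died_of_heart_attack: bool  # reality check: only "no" is plausible
    says_never_used_internet: bool   # reality check: the survey was taken online

def passes_quality_control(r: Response) -> bool:
    """Keep only respondents who pass both the attention and the reality checks."""
    attentive = r.attention_correct == r.attention_total
    plausible = not r.says_died_of_heart_attack and not r.says_never_used_internet
    return attentive and plausible

responses = [
    Response(True, 3, 3, False, False),   # plausible "yes"
    Response(True, 1, 3, False, False),   # inattentive
    Response(True, 3, 3, True, False),    # mischievous or joking
    Response(False, 3, 3, False, False),
]

kept = [r for r in responses if passes_quality_control(r)]
rate = sum(r.ingested_chemical for r in kept) / len(kept)
print(f"Prevalence after quality control: {rate:.0%}")
```

Requiring a perfect score on the attention items is just one possible rule; a study might instead use a more lenient threshold, or weight down rather than drop flagged respondents.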

Finally, as an additional quality control measure, anyone who said “yes” to ingesting household chemicals was asked a series of follow-up questions: They had to confirm that they had intentionally selected “yes,” and then provide some additional details about the context in which they ingested the chemicals.

So, what did the researchers find? They collected data from a total of 688 participants. Of these, 55 (8%) stated that they had ingested at least one of three household cleaning chemicals (disinfectant, soap, or bleach), a result similar to what the CDC reported. But of those 55, only 12 passed the basic quality control questions. In other words, almost 80% (43 out of 55) of respondents who claimed to have ingested a toxic chemical failed the simple word-association items, gave completely implausible answers to the reality check questions, or both.
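The percentages above follow directly from the counts reported in the replication study:

```python
# Reproducing the percentages above from the reported counts.
total_respondents = 688
reported_ingestion = 55   # said "yes" to at least one of the three chemicals
passed_checks = 12        # of those 55, passed the attention and reality checks

raw_rate = reported_ingestion / total_respondents                         # ~0.08, the "8%" figure
failed_share = (reported_ingestion - passed_checks) / reported_ingestion  # ~0.78, "almost 80%"
print(f"Raw rate: {raw_rate:.0%}; failed quality checks: {failed_share:.0%}")
```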

That left 12 apparent chemical drinkers who passed the quality control questions. Did they really drink bleach? If so, it would only be 1.7% of the sample, but that would still represent millions of Americans.

When asked to confirm whether they had in fact ingested household chemicals, 11 of the 12 stated that they had selected the “yes” option by mistake. And the one remaining participant — who verified that they had intentionally selected “yes” — responded to the question asking for more detail with “Yxgyvuguhih.” They also reported being 20 years old, having 4 children, weighing 1,900 pounds, and having a height of “100.” (They did not provide units, but neither inches nor centimeters are very plausible). Needless to say, these responses call into question the validity of the final participant’s data.

So how many Americans actually ingested bleach to ward off the coronavirus? We don’t really know. We would need more research to reliably answer that question, but the fact that the percentage dropped from 4% to 0% after accounting for basic data quality issues suggests that the real number is most likely a lot lower than headlines would suggest.

[Image caption: News coverage of the CDC’s report on Americans ingesting household chemicals]

To be clear, the takeaway here isn’t that all survey data is garbage. But especially when that data is used to support claims with serious societal repercussions, it’s essential to validate results with basic quality control interventions, such as the attention and reality checks described above. In the case of the CDC study, a failure to do so (along with some perhaps overzealous reporting) led researchers, media, and the public to believe that up to 12 million Americans were drinking bleach. This claim was likely not only false, but also potentially harmful, as it may have served to normalize these dangerous behaviors and thus increase the number of people who might actually engage in them.

Just as chemists and physicists need to ensure that their measurement tools are well calibrated, social scientists must ensure the quality of their data in order to avoid reaching misleading conclusions. While there are no surefire solutions, a bit of rudimentary quality control can go a long way toward validating the accuracy and reliability of self-reported data. At the same time, some of the responsibility falls on journalists, who must fact-check studies, report on them accurately, and avoid sensationalizing. And of course, as with any content, the final gatekeeper is you, the reader. The next time you come across something that sounds too outrageous to be true, know that your instinct may be correct. Ultimately, it’s up to all of us to think critically, do our research, and determine for ourselves whether a source can be trusted.
