Every day you can find a news story citing a poll -- how we feel about politicians, or soda, or TV -- stories that detail "America's" attitude toward sex, love, marriage, eating, drinking ...
And every one of these stories uses the poll to draw a conclusion -- approval ratings, election picks -- and then that conclusion is presented to us, the public, as if it were truth.
Problem is, polls are pure crap.
Tonight's example: here's a story about a poll that tells us that 1 in 4 Americans did not read a book last year. It also details who reads what and how often.
This is a nationwide poll that is supposed to represent 302,666,681 Americans (July 2007 estimate by the U.S. Census Bureau). Stop, and read that number again. Three hundred and two million people ... and this poll is based on the responses of 1,003 people.
The population of Connecticut was 3,510,297 in 2005 ... for God's sake, the population of West Haven was 52,721!
I don't care how many formulas or algorithms you use, you simply cannot judge what a million people think by asking only a thousand of them -- let alone the three hundred million these polls really claim to speak for.
1,003 = 302,666,681 ??
I'm not even going to go into all the "conclusions" this particular poll found, because it would drive me mad and I'd end up swearing, which I'm told I can't do here -- but I don't think you can draw any conclusions about anything by asking a thousand people, other than what those thousand people think.
If each state were represented evenly -- which would itself skew the results -- each would have, what, about 20 people representing the whole state?
Just think -- in your daily travels, how many total morons do you see? With a sample size of only 1,003 ... hell, 5 morons from each state could throw the numbers way off.
And you're supposed to consider that kind of information valid?
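Just to put rough numbers on that worry -- a back-of-the-envelope sketch in Python, and the 5-morons-per-state figure is purely my own hypothetical:

```python
# Back-of-the-envelope: how much could 250 "bad" respondents
# (a hypothetical 5 morons from each of 50 states) move a result
# in a poll of 1,003 people?

sample_size = 1003
bad_respondents = 5 * 50  # hypothetical figure from the rant above

# Suppose the true split on some yes/no question is 50/50.
true_share = 0.50

# Honest respondents answer at the true rate; worst case, the
# bad ones all answer "yes".
honest = sample_size - bad_respondents
reported = (honest * true_share + bad_respondents) / sample_size

print(f"True share:     {true_share:.1%}")   # 50.0%
print(f"Reported share: {reported:.1%}")     # ~62.5%
```

Two hundred fifty careless answers out of 1,003 could swing a 50/50 question by about 12 points -- way outside the plus-or-minus-3 the pollsters quote.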
It just gets under my skin, is all, because these polls are put in newspapers as if they were statements of fact. As far as I'm concerned, polls belong on the Op-Ed page.
2 comments:
Well, it's not quite correct that "you simply cannot judge what a million people think by asking only a thousand of them" -- you can, but it's a question of the degree of confidence; the article does conclude with "It had a margin of sampling error of plus or minus 3 percentage points." And those thousand (or however many) respondents must have been chosen to be as representative of the population as possible. For more on the methodology, see for example A Brief Introduction to Sampling.
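If you want to see where that plus or minus 3 comes from, it falls straight out of the standard formula -- here's a quick sketch in Python, assuming a simple random sample (which real polls only approximate with weighting):

```python
import math

# 95% confidence margin of error for a polled proportion,
# assuming a simple random sample (real polls weight their
# samples, so this is only an approximation).
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 1,003:  +/- {margin_of_error(1003):.1%}")   # ~3.1%
print(f"n = 2,500:  +/- {margin_of_error(2500):.1%}")   # ~2.0%
print(f"n = 10,000: +/- {margin_of_error(10000):.1%}")  # ~1.0%
```

Notice the population size never appears in the formula -- that's the counterintuitive part. For a large enough population, 1,003 respondents give you roughly the same plus or minus 3 points whether you're polling Connecticut or the whole country.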
Hi Stefan. Thanks for the comment.
I checked out the site you mentioned, and it was basically most of the stuff I learned in my surveys and statistics work at Southern.
I think even if pollsters can get a sense of what the majority might think about a given subject (say, an election poll) and make a prediction based on that, a poll that seeks to determine behavioral habits needs a really large sample size. I simply can't believe that 1,003 people can be representative of all the groups the poll claimed to represent -- race, religion, political affiliation, income, education -- and all possible combinations of them. That's just too many categories to cover with only 1,003 people.
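To show the arithmetic behind my doubt -- the category counts here are hypothetical, just to illustrate the point:

```python
# How many demographic "cells" does a poll like this claim to
# cover? Category counts are hypothetical, just for illustration.
categories = {
    "race":      5,
    "religion":  5,
    "party":     3,
    "income":    5,
    "education": 4,
}

cells = 1
for count in categories.values():
    cells *= count

sample_size = 1003
print(f"Possible combinations: {cells}")                    # 1,500
print(f"Respondents:           {sample_size}")              # 1,003
print(f"Average per cell:      {sample_size / cells:.2f}")  # ~0.67
```

Fifteen hundred possible combinations and only 1,003 respondents -- less than one person per combination, on average. And even a single subgroup of, say, 100 respondents carries a margin of error of about plus or minus 10 points by the formula above.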
When we used to do polls, we would use a 2,500 to 3,000 sample size for a statewide poll, and we considered that to be the bare minimum.
I guess what it all boils down to, for me, is that you can do the poll, and you can crunch numbers and get mathematical results, but that doesn't make them credible.
I don't necessarily have a problem with polls themselves; I have a beef with the way that information is used/presented.