Thursday, February 3, 2011

Cycling and Statistics

As someone with a background in the social and natural sciences, I was "raised" on statistics by the academic system. If we compare academia to religion, then making claims without statistical evidence is akin to taking the Lord's name in vain. But even beyond academia, we have an inherent faith in statistics as a culture. We respect numbers and charts, and we turn to them for comfort at times of uncertainty. Consider, for instance, this beautiful bar graph:

Now, some of us may have suspected that diamond frames tend to be ridden by men, whereas step-throughs and mixtes tend to be ridden by women, but only numbers and graphs have the power to lift us from the murky waters of speculation. We can now say that, in a recent poll conducted by Lovely Bicycle, of the 221 respondents who claimed to ride mainly diamond frame bikes for transportation, 76% were male. Of the 95 respondents who claimed to ride mainly step-through bikes for transportation, 80% were female. And of the 39 respondents who claimed to ride mainly mixte bikes for transportation, 66% were female. This numerical evidence we can wield like a mighty weapon the next time someone contradicts these tendencies.
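For anyone who wants to turn those percentages back into approximate head counts, here is a tiny Python sketch. The respondent totals and percentages are the poll figures quoted above; the rounding to whole people is my own:

    # Poll percentages quoted above, converted back into approximate counts.
    # Totals and shares are from the post; rounding to whole respondents is mine.
    poll = {
        "diamond frame": (221, 0.76, "male"),
        "step-through":  (95,  0.80, "female"),
        "mixte":         (39,  0.66, "female"),
    }

    for frame, (total, share, majority) in poll.items():
        count = round(total * share)
        print(f"{frame}: ~{count} of {total} respondents were {majority}")

    # diamond frame: ~168 of 221 respondents were male
    # step-through:  ~76 of 95 respondents were female
    # mixte:         ~26 of 39 respondents were female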

Of course, the one little problem with statistics is that they are mostly BS. In the words of comedian Vic Reeves, "88.2% of statistics are made up on the spot" - which may very well be the case. But numbers need not be maliciously forged in order to misrepresent reality. There are multitudes of ways in which a study can be flawed or biased from the start, set up so as to elicit particular responses. Often this is done unintentionally, or at least unconsciously, by researchers eager to find evidence for their pet theories. Bias can also creep in through how the data is processed, or even how the final results are presented. Statistics are highly prone to human error and bias, which makes them inherently subjective. This, combined with how much we respect them, makes our statistics-loving culture susceptible to misinformation.

[image via NHTSA]

The idea of statistics and misinformation brings me to what I really wanted to talk about here, something I have been trying to make sense of for a while. I am puzzled by the use of safety statistics in bicycle advocacy, and I am hoping that someone can explain them to me. For example, many bicycle advocacy talks and internet presentations stress that it is safer to ride a bike than to travel in a car. In support of this, they cite statistics such as the data from the National Highway Traffic Safety Administration (NHTSA), according to which there were about 30,000 motor vehicle traffic fatalities and about 600 bicycle fatalities in the USA in 2009. Advocates use these numbers to argue that cycling is much safer than driving. But unless I am missing something, the figures suggest just the opposite.

Yes, the NHTSA numbers suggest that in 2009 there were 50 times more motor vehicle fatalities than bicycle fatalities in the US. But those numbers mean nothing until they are weighed against how many cars vs. bicycles there are on the roads at large. If there were 50 times more cars on the road than bicycles, then the risk of a fatal traffic accident would be equal for each mode of transportation. But I believe that in actuality there are more like 1,000 times more cars on the roads than bicycles... which means that the number of cycling fatalities is disproportionately high.
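To make this exposure argument concrete, here is a rough back-of-the-envelope sketch in Python. The fatality counts are the 2009 NHTSA figures quoted above; the numbers of cars and bicycles "in use" are purely illustrative assumptions, chosen only to reflect the hypothetical 1,000-to-1 ratio speculated here, not real data:

    # Back-of-the-envelope comparison: raw fatality counts vs. fatalities
    # weighted by exposure. Fatality counts are the 2009 NHTSA figures
    # quoted above; the vehicles-in-use numbers are ILLUSTRATIVE ASSUMPTIONS,
    # chosen only to reflect a hypothetical 1,000-to-1 ratio of cars to bikes.
    fatalities_2009 = {"car": 30_000, "bicycle": 600}
    vehicles_in_use = {"car": 200_000_000, "bicycle": 200_000}  # assumed, not measured

    for mode in fatalities_2009:
        rate = fatalities_2009[mode] / vehicles_in_use[mode]
        print(f"{mode}: {rate:.5f} fatalities per vehicle in use")

    # With these assumed numbers:
    #   car:     30,000 / 200,000,000 = 0.00015
    #   bicycle:    600 /     200,000 = 0.00300
    # i.e. roughly 20 times the per-vehicle risk for cycling, even though
    # the raw counts (50 times more car deaths) seem to point the other way.

Researchers usually normalize by some measure of exposure, such as miles traveled, trips taken, or hours on the road, rather than by counting vehicles. But whichever denominator is used, the point stands: raw fatality totals by themselves say nothing about relative risk.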

Obviously, I am not trying to prove that cycling is unsafe. But I do want to understand the reality of the situation. After all, if cycling advocates use statistics incorrectly, they open themselves up to some very harsh critique from unfriendly forces. Where could one go to obtain accurate statistics about the number of cars vs bicycles on the roads, and the number of traffic accidents for each?
