Jul 15 2011

In a classic example of a deceptive news story, the BBC announced today in their TV news bulletin that 8 European banks failed the stress test – meaning they might not survive a financial crisis.

But later, listening to Euronews (and in fact in the online article by the BBC), I heard a slightly different slant to the story – 8 out of the 90 banks tested failed. Or 9%. Or to put it another way, 91% of European banks passed the stress test.

Merely announcing that 8 failed the test gives no indication of the scale of the problem – was it 8 out of 8 ? 8 out of 80 ? Or 8 out of 800 ? Given the current climate in the wake of the recent banking crisis, it is not unreasonable for viewers to assume that 8 banks failing the test indicates a much more serious problem than it really is.
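To make the point concrete, here is a trivial sketch (the denominators 8, 80 and 800 are the hypothetical ones from the paragraph above; 90 is the actual figure) showing how the same raw count translates into very different rates:

```python
# A raw count of failures means little without knowing how many were tested;
# the failure *rate* is what conveys the scale of the problem.
def failure_rate(failed: int, tested: int) -> float:
    """Failure rate as a percentage of those tested."""
    return 100.0 * failed / tested

# The same "8 banks failed" headline against different denominators:
for tested in (8, 80, 800, 90):
    print(f"8 of {tested} failed -> {failure_rate(8, tested):.0f}% failure rate")
```

The headline count is identical in every case; only the denominator – which the TV bulletin omitted – changes the story from total collapse to a marginal problem.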

Is it so much to ask that the media actually spend some time thinking about statistics and give a proper slant to the news they announce ? Saying 9% failed the test is just as quick as saying 8 banks failed, and it gives us more information. The objection that some people may not understand percentages is pretty bogus – those who do not understand them are unlikely to be bothered by the “8 banks failed test” statement anyway.

A case of over simplification leading to unintended (or was it intentional?) deception.

Feb 04 2010

It is always good to see statisticians give a good hard kick to those who put the word “lies” into the saying “lies, damned lies, and statistics” … the politicians. In this particular case the Conservative Shadow Home Secretary Chris Grayling has been making comparisons between violent crime statistics from the 1990s and the year 2008/9 using the police recorded crime statistics. The UK Statistics Authority has said (unfortunately it is a PDF document) something more or less along the lines of “you can’t do that” (in an astonished and shocked tone of voice).

According to the UK Statistics Authority, the method for recording crime statistics in police stations was standardised in 2002/3, leading to a marked increase in recorded crime that year due to the change. Indeed they point out that all published statistics on police recorded crime clearly emphasise the fact that the figures cannot be naively compared with values before 2002/3. The statisticians say that comparable crime figures should instead be obtained from the British Crime Survey (BCS).

The UK Statistics Authority is worried about politicians using statistics to mislead the public and discredit official statistics.

How does Chris respond ? Basically by saying that he doesn’t believe the BCS and that the increase in reported crimes is too big to be explained away by changes in the recording method.

Who would I rather believe ? Who would you rather believe ? A politician ? Or a statistician ? No contest really; statisticians may not have the best reputation, but at least they do not inspire the same level of disgust as politicians do – a level normally reserved for paedophiles.

The interesting thing is that people believe that violent crime has gotten worse over the last decade. As to why they believe this I don’t know, because from personal experience I can tell you that violent crime has decreased dramatically over the last decade. Back in the day, I used to be off down town most Saturday nights (and often Fridays too), and almost every night out there would be some sort of fracas varying from a bit of a scuffle in a pub, to an all out street brawl with police helmets flying. These days ? I tend to stay home a great deal more, and there is almost no violence that I can see around.

The whole reason for statistical surveys is to go beyond personal experience and belief, to get much closer to the truth. And when you have that statistical survey you do not throw it away because you do not like the results. You have to change your beliefs. Ordinary people can be forgiven for not doing so, but a politician in the position of Shadow Home Secretary has a responsibility to do his or her best for the country.

Let us examine the “lie” accusation a little closer. Using statistical data in an inappropriate manner – such as comparing reported crime figures whose recording methodology was different – is just as much a lie as a school child yelling out “You smell”. It also helps to discredit statistics as a whole, because the public is given the impression that one set of statistics says one thing and another says something different – which is not the case at all.

That is hard for a Tory whose lies are told in pursuit of the undoubtedly unselfish goal of removing the present Labour government.

Nov 12 2008

So this morning I am sitting in front of the TV with my caffeine fix and some news channel on to break up the silence. On comes this item about how a school has shown a dramatic improvement after having introduced a new disciplinary regime. Something like an increase of 25% in pupils getting 5 or more GCSEs (I cannot recall the exact numbers, and it does not matter anyway).

But wait! The reporter goes on to say this increase does not include English or Maths and when those subjects are included, the increase is not quite as dramatic. So English and Maths are unimportant subjects are they ? Or perhaps the story does not come across as so interesting if the real increase is given.

So this reporter has stretched the truth (i.e. lied) by reporting a meaningless statistic that sounds good rather than a proper set of figures which would still sound good to those who do not have unrealistic expectations.

Why does the media do this ? Well, obviously to make things sound better than they really are or, more usually, worse than they really are. That is fair enough on an entertainment show, but surely news should present the facts and not try to stretch the truth.

Apr 18 2008

I recently discovered one of the most entertaining web reads I’ve come across for ages … Bad Science, a site dedicated to pointing out where people (mostly the media) use “Bad Science” or fall victim to it. The author (Ben Goldacre) is a medical doctor, so most of the criticisms relate to medicine rather than science in general. But the debunking of rubbish media reports on (mostly) medical issues is worthwhile and done in an entertaining way.

It is interesting that many of the more foolish reports in the media have to do with bad statistics rather than bad science itself. That is I suppose not too surprising, as statistics seems to be widely misunderstood.

I have the advantage that many years ago I spent some time studying statistics, and many media reports have the effect of making my inner statistician jump up and down in fury shouting “Bullshit” over and over again. Fortunately he doesn’t shout too loudly or I’d run the risk of being shut away in a room with nice soft walls.

Statistics don’t lie, but they don’t always say what we think they do

I’m going to make use of an example relating to cannabis and an article published by that paragon of excellent and accurate reporting, the Daily Mail. The article itself is here … scary isn’t it?

A report whose statistics say that people who smoke cannabis have a 41% higher risk of schizophrenia indicates that cannabis smokers are more likely to have schizophrenia than the general population. It does not mean that cannabis causes mental health issues; that is an untested hypothesis. A quick uneducated guess at a number of possible reasons why includes :-

  • Cannabis use increases the risk of mental health problems (yes it is possible).
  • People with mental health problems are more likely to use cannabis than others.
  • Cannabis use makes existing mental health problems worse.
  • There is no link between cannabis use and mental health problems; the correlation is accidental.
  • The study that found a correlation between cannabis use and mental health problems is flawed and there is in fact no such correlation.
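There is also a trap in the headline figure itself: a “41% higher risk” is a relative increase, and it means nothing until you know the baseline it multiplies. A quick sketch – the 0.7% baseline below is purely an illustrative assumption, not a figure from the study:

```python
# Relative vs absolute risk: "41% higher" multiplies the baseline;
# it does not add 41 percentage points.
baseline = 0.007            # ASSUMED baseline risk (0.7%), for illustration only
relative_increase = 0.41    # the "41% higher risk" from the headline

elevated = baseline * (1 + relative_increase)
print(f"baseline risk: {baseline:.2%}")   # 0.70%
print(f"elevated risk: {elevated:.2%}")   # 0.99% - still roughly 1 in 100
```

On a small baseline, a scary-sounding relative increase still leaves a small absolute risk – which is exactly the detail such headlines tend to omit.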

One of the biggest mistakes anyone can make with statistics is to take a link between two variables (a correlation) and assume that one variable causes another (cannabis use causes mental health issues). This is known as “Correlation does not imply causation”; stealing a Wikipedia example, there is a correlation between going to bed with shoes on and waking up with a headache. Sleeping with shoes on does not cause headaches, but drinking copious quantities of alcohol makes it more likely that you will sleep with your shoes on, and far more likely you will wake with a headache.
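The shoes-and-headache example can be simulated directly. In this toy model (all the probabilities are made up for illustration) drinking is the confounder: it raises the chance of both sleeping with shoes on and waking with a headache, so the two end up correlated even though neither causes the other:

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

# Toy confounder model: heavy drinking raises the probability of both
# sleeping with shoes on and waking with a headache. Shoes never cause
# headaches here, yet the two outcomes become associated.
n = 10_000
shoes, headache = [], []
for _ in range(n):
    drank = random.random() < 0.3                        # 30% drank heavily
    shoes.append(random.random() < (0.6 if drank else 0.05))
    headache.append(random.random() < (0.7 if drank else 0.1))

# Compare P(headache | shoes on) with P(headache | shoes off)
p_given_shoes = sum(h for s, h in zip(shoes, headache) if s) / sum(shoes)
p_given_no = sum(h for s, h in zip(shoes, headache) if not s) / (n - sum(shoes))
print(f"P(headache | shoes on)  = {p_given_shoes:.2f}")
print(f"P(headache | shoes off) = {p_given_no:.2f}")
```

The conditional probabilities differ sharply, which naive reading would take as “shoes cause headaches”; splitting the sample by drinking status instead would make the apparent association largely disappear.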
