Pretty Simple: web, digital, social

Biased results are a risk of online polls

I’ve just updated my list of social media lessons learned the hard way with details of an online poll which appears to have backfired.

In summary: an online poll on the website of Christian charity Alpha International, part of a multi-million pound advertising campaign, asked whether people believed in God and returned an abnormally high 98% saying ‘No’ (source: The Register).

Alpha International has suggested that the skewed results are down to an online sting, with spokesman Mark Elsdon-Dew adding: “I don’t think this is indicative of people’s faith in this country.” This seems highly likely – especially as the poll allows multiple (in fact unlimited) responses from the same computer. Many online polls use cookies, IP logging or similar to prevent people from responding more than once – on each subsequent visit to the poll page, the user is normally taken straight to the results page instead. Without such measures, the accuracy of your poll is at risk from repeat responses.
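To illustrate the deduplication idea, here is a minimal sketch in Python (all names are hypothetical, not from any real polling software): each voter is identified by a token – in practice a browser cookie or the requester’s IP address – and any token that has already voted is turned away.

```python
class Poll:
    """A toy poll that rejects repeat responses from the same token."""

    def __init__(self, options):
        self.counts = {option: 0 for option in options}
        self.seen = set()  # tokens (cookie values / IPs) that have already voted

    def vote(self, token, option):
        """Record a vote; return False if this token has voted before."""
        if token in self.seen:
            return False  # repeat visitor: send them to the results page instead
        self.seen.add(token)
        self.counts[option] += 1
        return True


poll = Poll(["Yes", "No", "Probably"])
poll.vote("cookie-abc", "Yes")   # first vote from this token: counted
poll.vote("cookie-abc", "No")    # repeat vote from the same token: rejected
```

Cookies alone are easy to clear, and IP addresses can be shared or rotated, so real polls often combine several signals – but even this basic check would have stopped the casual “vote early and vote often” sabotage described below.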

It’s not clear whether the skewed results were the result of an automated sting, using software to generate high numbers of repeated responses, or a manual sabotage, with willing individuals logging on and submitting multiple responses. It seems probable that it was one or the other – it’s unrealistic to expect that kind of response from the normal users of such a site, who are more likely to be in the ‘Yes’ or ‘Probably’ camps.

Organised sabotage?

A quick Google search reveals all sorts of online discussion around the poll, including one thread which explicitly invites fellow members to head over to the poll and “make it look even worse”. Within two days, 80 people had replied to that post, one of whom had spotted the technical flaw in the poll:

Ooooo – it lets you vote more than once. Vote early and vote often folks!

Other more productive replies talk about the lack of decent response options, with one person pointing out:

So they have “yes”, “no”, and “probably”… where’s “probably not”?

(to which I’d also add a middle option of “Don’t know”)

Luckily for Alpha International, the completely inaccurate results are obvious to all, potentially lessening the damage done. But, it has to be asked, what was the point of the survey in the first place? What would the ideal response have been for Alpha International?

This is a crucial question when considering online polls. Are we using them for real fact-finding, or are we just trying to prove a point? I recently saw a site which asked for parents’ opinions about local school closures. To me, this seemed a little unwise. A strongly negative response may have adversely affected any attempt by the authority to close schools, whilst a strongly positive response would have seemed rigged. That site did at least employ cookies to limit multiple responses, but with such an emotive subject it wouldn’t be hard for campaigners to direct vast numbers of people to the poll to make their opinions heard.

Quantity vs quality

And there is another risk of online polls: the lack of qualitative reinforcement. Quantitative figures can only show us part of the picture – if we are really trying to gauge public opinion we should be entering into discussions, teasing out issues which we may not have thought of, and possibly even turning around some of the negative responses by offering valid counter-arguments and supporting information.

In conclusion

The Alpha poll is a perfect example of how over-simplified, under-restricted public polls can seriously backfire. They can be handy for giving people a quick and easy way of starting to engage in a discussion, but only really serve as the start of such a process. Adding forum functionality, to allow people to qualify their response, is an ideal way of taking this further, allowing pollsters to engage and discuss. Without this, polls are at best a fairly worthless set of figures, and at worst a PR nightmare waiting to happen.


4 Responses to Hard lessons in social media: Online polls

  1. Gary Miller says:

    *groan* Used to be a time when all we had to worry about were the statisticians manipulating results; now we have the respondents themselves doing it!

    Nice article James.

  2. Heather says:

    I take it you missed this a few months ago? It was a good laugh!

    Twitterers claim victory over loaded daily Mail gypsy poll

    • James says:

      Brilliant! Another perfect example of the risks of online polls. Thanks Heather. Especially interesting that:

      Angered Twitter users have now vowed to take their campaign to all of the Daily Mail’s online polls

      Which could seriously impact on the Mail’s attempts to ever do a poll again. A hard lesson indeed.

      • James says:

        Further to this, I’ve just spotted that there’s a Twitterer called @Polljack whose profile says:

        Follow us to find out which stupid, loaded, illiberal Daily Mail polls are ripe for skewing. Together, we will ruin the results!
