I’ve just updated my list of social media lessons learned the hard way with details of an online poll which appears to have backfired.
In summary, part of a multi-million pound advertising campaign by Christian charity Alpha International appears to have backfired after an online poll on its website, asking whether people believed in God, returned an abnormally high 98% answering ‘No’ (source: The Register).
It’s not clear whether the skewed results came from an automated sting, using software to generate large numbers of repeated responses, or from manual sabotage, with willing individuals logging on and submitting multiple votes. It was probably one or the other: it’s unrealistic to expect that kind of response from the normal users of such a site, who are more likely to be in the ‘Yes’ or ‘Probably’ camps.
A quick Google search reveals several online discussions about the poll, one of which explicitly invites fellow members to head over to the poll and “make it look even worse”. Within 2 days, 80 people had replied to that post, one of whom had spotted the technical flaw in the poll:
Ooooo – it lets you vote more than once. Vote early and vote often folks!
Other more productive replies talk about the lack of decent response options, with one person pointing out:
So they have “yes”, “no”, and “probably”… where’s “probably not”?
(to which I’d also add a middle option of “Don’t know”)
Luckily for Alpha International, the completely inaccurate results are obvious to all, potentially lessening the damage done. But it has to be asked: what was the point of the survey in the first place? What would the ideal response have been for Alpha International?
This is a crucial question when considering online polls. Are we using them for real fact-finding, or are we just trying to prove a point? I recently saw a site which asked for parents’ opinions about local school closures. For me, this seemed a little unwise. A strongly negative response may have adversely affected any attempt by the authority to close schools, whilst a strongly positive response would have seemed rigged. That site did at least employ cookies to limit multiple responses, but with such an emotive subject it wouldn’t be hard for campaigners to direct vast numbers of people to the poll to make their opinions heard.
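For the curious, the cookie approach mentioned above can be sketched in a few lines. This is a hypothetical illustration (the `Poll` class and its methods are my own invention, not any site’s actual code): the server issues a token on a visitor’s first vote and rejects further votes that present a token it has already seen.

```python
import secrets

# Hypothetical sketch of cookie-based duplicate-vote limiting.
# In a real deployment the token would be sent as a Set-Cookie header
# and stored server-side; here a set stands in for that storage.
class Poll:
    def __init__(self, options):
        self.counts = {opt: 0 for opt in options}
        self.seen_tokens = set()

    def vote(self, option, cookie_token=None):
        """Record a vote. Returns (accepted, token_for_cookie)."""
        if cookie_token in self.seen_tokens:
            # Browser presented a token we've already counted: reject.
            return False, cookie_token
        token = secrets.token_hex(8)   # fresh token for this browser
        self.seen_tokens.add(token)
        self.counts[option] += 1
        return True, token
```

Of course, this only deters casual repeat voting: anyone who clears their cookies, opens a private browsing window, or scripts fresh sessions gets a new token each time, which is exactly why a determined campaign can still flood such a poll.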
Quantity vs quality
And there is another risk of online polls: the lack of qualitative reinforcement. Quantitative figures can only show us part of the picture. If we are really trying to gauge public opinion we should be entering into discussions, teasing out issues which we may not have thought of, and possibly even turning around some of the negative responses by offering valid counter-arguments and supporting information.
The Alpha poll is a perfect example of how over-simplified, under-restricted public polls can seriously backfire. They can be handy for giving people a quick and easy way of starting to engage in a discussion, but only really serve as the start of such a process. Adding forum functionality, to allow people to qualify their response, is an ideal way of taking this further, allowing pollsters to engage and discuss. Without this, polls are at best a fairly worthless set of figures, and at worst a PR nightmare waiting to happen.