I’ve been reading about Australia’s Best Blog Competition – the winner and runners-up were announced today.
The Singing Bridges Travel Diary won. Interesting site.
More interesting to me, however, is that the organisers – SmartyHost, who used the competition to market their blog hosting services – have come under some fire for their judging system. According to a journalist at The Age who was on the panel, the judges seem to have marked the entries against different criteria, and no standardisation of the scores was attempted. Apparently the winning entry was voted first by only two of the judges, and
“It took the prize only because one of the judges gave it a total of 40 points – by far the most generous award by any judge – and the other 35 points.”
The judges (the report does not state how many there were) saw only the 11 finalist entries. It sounds like not much thought went into the exact scoring process.
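To see why that lack of standardisation matters, here’s a minimal sketch (in Python, with invented judges and scores, not the actual competition data) of putting each judge’s marks onto a common scale before totalling them. With raw totals, one generous marker can single-handedly decide the winner; once each judge’s scores are converted to z-scores, every judge’s ranking carries equal weight.

```python
from statistics import mean, stdev

# Hypothetical raw scores: judge -> {entry: score}. Judge C marks far more
# generously than the others, echoing the "40 points" complaint above.
raw_scores = {
    "Judge A": {"Entry 1": 22, "Entry 2": 18, "Entry 3": 20},
    "Judge B": {"Entry 1": 25, "Entry 2": 19, "Entry 3": 21},
    "Judge C": {"Entry 1": 30, "Entry 2": 45, "Entry 3": 28},
}

def standardise(scores):
    """Convert one judge's scores to z-scores (mean 0, spread 1),
    so every judge's marks sit on the same scale."""
    mu, sigma = mean(scores.values()), stdev(scores.values())
    return {entry: (s - mu) / sigma for entry, s in scores.items()}

entries = list(next(iter(raw_scores.values())))

# Without standardisation, Judge C's generous scale dominates the totals.
raw_totals = {e: sum(judge[e] for judge in raw_scores.values()) for e in entries}

# With standardisation, each judge's relative ranking counts equally.
z = {name: standardise(scores) for name, scores in raw_scores.items()}
std_totals = {e: sum(judge[e] for judge in z.values()) for e in entries}

print("Raw winner:         ", max(raw_totals, key=raw_totals.get), raw_totals)
print("Standardised winner:", max(std_totals, key=std_totals.get), std_totals)
```

Run as-is, the raw totals crown Entry 2 purely on the back of Judge C’s inflated numbers, while the standardised totals pick Entry 1 – the entry two of the three judges actually ranked first.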
This is something that we were very conscious of when organising the WA Web Awards. The WAWAs were organised by the committee of Port 80, and naturally almost all of the committee members, as active participants in their local industry, wanted to enter their own client sites. In order to maintain legitimacy and avoid any hint of bias, a very strict judging process had to be set up.
Firstly, one committee member was designated the judging chairperson. That person was not permitted to enter any sites themselves, and was the only individual dealing with the judges and the entry process. Of course, that meant a lot of work for that one person – Megs did an awesome job – but it ensured that no one could unfairly influence the judges, even accidentally. We’ll get you an assistant next year, Megs!
Secondly, a very detailed set of judging criteria was developed, customised for each of the 14 categories, with guidelines for what constituted good and bad scores on each. All four judges commented on the fairness and quality of the criteria – which was very reassuring. Our tireless judging chairperson averaged the scores and determined the winners using a mathematical process, and in fact she was the only person who knew the winners ahead of time; even the judges did not know (although they could probably have made some educated guesses).
In the end each of the winning sites was very deserving (especially the ecommerce category, smirk), and I heard of no sniping or disagreement with the results. So I guess our system was successful.
I’ll be interested to see how the Best Blog Competition evolves… presumably they will learn from their mistakes and take it to bigger and better things in the future. For me, it has reinforced just how important a good judging process is to the perceived legitimacy of a competition – the last thing anyone wants is to receive a commendation in a competition whose validity is then called into question.