Adi is a social business blogger and community manager who writes for sites such as Social Business News and Social Media Today. Away from the computer he enjoys cycling, particularly in the Alps. Adi is a DZone Zone Leader and has posted 1242 posts at DZone. You can read more from him at his website.

The role of task difficulty in collective intelligence

10.15.2013

Since James Surowiecki published his book on the wisdom of crowds back in 2005, the notion that collective intelligence can trump individual intelligence has entered the public consciousness.  It has inevitably led to constructs such as idea markets, where organisations attempt to tap into the crowd for everything from stock picks to business ideas.

A new study has explored whether crowd intelligence works equally well regardless of the difficulty of the task being attempted.  The researchers devised a number of tasks for the crowd to perform, with the complexity ranging from answering “if all poor people in the world gave you $1 each, how much sustainable monthly income could you derive from the resulting amount?” at one end, to “what is the high temperature in Seoul today?” at the other (Seoul was the home town of the participants).

The 500 participants each had to answer 8 of these tasks, with no prior preparation allowed.  Measures were also taken to prevent anyone from looking up the answers online, and the data produced from the tasks suggests none of the participants did so.

The results showed that the power of the crowd was most prominent when the questions posed were in the mid-range of difficulty, or in other words, when the task required access to a wide range of specialised information.  In these kinds of situations the crowd did significantly better than the individual.

By contrast, when the tasks were very easy, both individuals and the crowd performed well, so there was little to separate them.  That is arguably to be expected, but of more interest was the performance of the crowd in the hardest task.

Here it emerged that individuals did better than the crowd.  The data suggests that participants were generally so clueless about the answer that they resorted to random guesses, producing such a wide spread of answers that even aggregation could not converge on the true value.
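The intuition behind this can be sketched with a toy simulation (my own illustration, not the study's data, with made-up numbers): when guesses are noisy but centred on the truth, averaging cancels the noise, whereas purely random guesses average to the middle of the guessing range, wherever the truth happens to lie.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 27.0  # e.g. the day's high temperature in Seoul, in Celsius

# Moderate task: each guess is noisy but unbiased, so aggregating
# many guesses cancels the individual errors.
informed_guesses = [random.gauss(TRUE_VALUE, 5.0) for _ in range(500)]

# Very hard task: participants guess essentially at random over a
# wide range, so the aggregate reflects the range, not the truth.
random_guesses = [random.uniform(0, 1000) for _ in range(500)]

print(abs(statistics.mean(informed_guesses) - TRUE_VALUE))  # small error
print(abs(statistics.mean(random_guesses) - TRUE_VALUE))    # large error
```

The first aggregate lands within a fraction of a degree of the true value, while the second lands hundreds of units away, mirroring why aggregation helps on mid-difficulty tasks but not on tasks where the crowd is guessing blindly.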

So, it seems that if you want to truly capitalise on crowd intelligence, you need to ensure the task is neither too difficult nor too easy.  That isn’t to say that hard tasks cannot be tackled, of course, but merely that such tasks need to be managed and reduced to moderate levels by the crowd themselves.

Original post