OK, I know this story is from Finland, and it’s about national legislation, which means it’s not exactly a Rip-Off-And-Duplicate for a lot of you. But one of the things that I pound on in the book is the need to CrowdSource Wisdom from the public — we need to engage people deeply and meaningfully, not only in telling us whether they like or don’t like something, but in partnering with us to actually make intelligent decisions and enable good things to happen. The benefits are extensive — from building a base of support, to redirecting disruptive debate, to -gasp- maybe even making better decisions. I am pretty well convinced that most of the dysfunction and wasted time and money we see in our public meetings, our planning, our “community engagement” efforts, etc., would largely go away if we remade our public engagement processes.
Hence the importance of this exercise, which was tried on a new proposal for an off-road traffic law. The goal, as the authors say, was to test whether this kind of law-making would work, and it looks like it does. So when I posted this article to EngagingCities last week, it also went into the Good Ideas File.
Here are some lessons learned from this pioneering project so far.
1. People participate in a constructive way.
A substantial number of people are really eager to participate when they are given a meaningful opportunity to do so. That opportunity needs to be something they care about, and there needs to be a plausible promise: their participation must lead to something.
2. The crowd is not delusional about its potential impact on the law.
The crowd is hopeful but realistic. Participants understand that any one idea or opinion may not count for much in the end; hundreds of other opinions also need to be heard, and the end result, the law, will be a compromise among many perspectives.
3. Crowdsourcing creates learning moments.
As the participants exchanged information and arguments on the crowdsourcing platform, they learned from each other. As one interviewee who participated in the crowdsourcing said:
“I’m somewhat surprised to see that the online process serves as a way to add to the participants’ knowledge base and to correct their incorrect perceptions. I had read the current law and the expired bill carefully, and I realized that quite a few participants didn’t have a correct understanding of the terms of the law and its implementation. But in many conversation threads these misconceptions seemed to transform into correct ones, when somebody corrected the false information and told where to find the correct information.”
4. Crowdsourcing works as knowledge search.
In our case, we were crowdsourcing ideas to improve the law, not delegating ultimate decision-making power to the crowd, so the problem of legitimacy is not necessarily acute. Because the focus was on collecting ideas and information, an idea didn’t gain more weight from being voted on multiple times.
5. The crowd is smart.
Based on the idea evaluation results collected so far, we conclude that the crowd – at least this specific Finnish crowd – is smart. The evaluation used a new crowd evaluation tool built by David Lee at Stanford University: each participant reviewed a random sample of ideas by comparing, ranking and rating them. Based on the evaluation analysis, it seems that the crowd preferred commonsensical and nuanced ideas while rejecting vague and extreme ones.
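The post doesn’t describe how the tool aggregates those judgments, but as a minimal sketch, pairwise comparisons like the ones described can be turned into a ranking by each idea’s win fraction (the idea IDs and votes below are hypothetical):

```python
from collections import defaultdict

def rank_ideas(comparisons):
    """Rank ideas best-first by win fraction across pairwise comparisons.

    comparisons: list of (winner, loser) idea-id pairs, one per judgment.
    """
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in comparisons:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return sorted(appearances, key=lambda i: wins[i] / appearances[i],
                  reverse=True)

# Hypothetical judgments: "A" beats "B" twice, "B" beats "C", "A" beats "C".
votes = [("A", "B"), ("A", "B"), ("B", "C"), ("A", "C")]
print(rank_ideas(votes))  # → ['A', 'B', 'C']
```

Win fraction is a crude aggregate; real tools of this kind typically fit a statistical model to the comparisons, but the idea of pooling many small random-sample judgments into one ranking is the same.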
6. Minority voices were not lost.
A very interesting and successful method for analyzing the evaluation results was clustering… Being able to identify a minority cluster is important because it lets us analyze the crowd evaluation at a more detailed level: clustering separates the minority voices from the majority so the minority can still be heard. The technique can also motivate minorities to participate in online crowdsourcing efforts, because we can promise them that their voices won’t simply be drowned out by whatever majority emerges.
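The post doesn’t say which clustering method the project used; as a hedged sketch, a plain k-means over per-participant rating vectors is enough to show how a small but coherent minority ends up in its own cluster rather than being averaged away (the ratings below are invented for illustration):

```python
def _d2(p, q):
    """Squared Euclidean distance between two rating vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=20):
    """Plain k-means with deterministic farthest-point initialization."""
    centroids = [points[0]]
    while len(centroids) < k:  # seed with the point farthest from current seeds
        centroids.append(max(points,
                             key=lambda p: min(_d2(p, c) for c in centroids)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each participant to the nearest centroid
            clusters[min(range(k), key=lambda j: _d2(p, centroids[j]))].append(p)
        centroids = [  # recompute centroids; keep the old one if a cluster empties
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return clusters

# Hypothetical 1-5 ratings of three ideas by six participants:
# four share a majority view, two hold the opposite, minority view.
ratings = [(5, 4, 1), (5, 5, 1), (4, 4, 2), (4, 5, 1),
           (1, 1, 5), (2, 1, 5)]
majority, minority = sorted(kmeans(ratings, k=2), key=len, reverse=True)
print(len(minority))  # → 2: the minority view survives as its own cluster
```

Averaging all six rating vectors would bury the two dissenters; clustering first, then summarizing each cluster separately, is what makes the promise in point 6 possible.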