3Vs Crowdsourcing Framework for Elections

By Editor, iHub
Published 22 Apr 2017

Over the past six years, the iHub has conducted many research projects on technology use and capacity in the East African region, under the thematic areas of Innovation & Entrepreneurship, Governance & Technology, and Mobile & Web.

These studies have provided value to the thriving tech scene by building an understanding of customers’ needs and technology requirements, market opportunities to be seized, and current and future trends, both locally and globally.

As the country prepares for the upcoming general elections in August, we would like to share research conducted in 2013, “The 3Vs Crowdsourcing Framework for Elections (Viability, Verification, Validity)”. The findings from this research remain relevant today, especially now that Kenya is entering another election period.

We pride ourselves on surfacing the most relevant information to influence the decision-making of our technology stakeholders.

In 2013, the iHub conducted a project that looked to develop a framework for validating crowdsourced information. The research examined a commonly held assumption that crowdsourced information (collected from citizens through online platforms such as Twitter, Facebook, and text messaging) captures more information about the on-the-ground reality than traditional media outlets like television and newspapers. We used Kenya’s General Elections on March 4, 2013, as a case study event to compare information collected from the crowd with results collected by traditional media and other sources.

The major aims of this study were:

  • To assess whether crowdsourcing information is viable in the Kenyan context.
  • To understand what information, if any, Twitter provided beyond traditional media sources and other crowdsourcing platforms, such as Uchaguzi.
  • To understand whether mining social media data might be useful for traditional media.

The findings from the research include:

  1. ‘Passive Crowdsourcing’ is viable during elections in Kenya. We gathered approximately 2.57 million tweets over the period March 3 – April 9, 2013, from which we filtered 12,000 ‘newsworthy’ tweets representing 96 separate incidents. The analysis was performed using standard computer hardware and open-source software. Twitter covered every election-related incident that was reported in traditional media and, in addition, provided a large amount of election-related information that traditional media did not publish.
  2. Twitter breaks news during elections. In our comparison of different information sources, we examined and compared their time profiles. By looking at the lead and lag relationship between when news breaks in the traditional media and when it breaks on Twitter, we found that Twitter either reports on the same day as traditional media or leads it, usually by a margin of one day. Given how many incidents occur within an election period, a lead time of one day can be quite important. This finding highlights that Twitter’s value stems not only from increased incident coverage but also from its ability to offer information in real time.
  3. Mining of Twitter data without machine learning is not feasible. We found that Twitter provides access to useful, first-hand, eyewitness information in near real time that is not available in other media. However, extracting this information required data-mining techniques from the field of machine learning, and therefore technical expertise; simple keyword searching of the data was not feasible (see the sketch after this list).
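
To make finding 3 concrete, here is a minimal sketch, in Python with scikit-learn, of the kind of supervised text classification that machine learning makes possible here. The tweets, labels, and model choice are illustrative assumptions and not the project’s actual data or pipeline; the point is that a trained classifier can score tweets by their content, where a fixed keyword search cannot.

```python
# A minimal, illustrative sketch (not the study's actual pipeline) of separating
# 'newsworthy' election tweets from everyday chatter with a supervised classifier.
# Uses scikit-learn; the tweets and labels below are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical hand-labelled training sample: 1 = election incident, 0 = noise.
train_tweets = [
    "Long queues at the polling station, voting delayed by two hours",
    "Ballot boxes arriving late, crowd at the tallying centre growing restless",
    "Just had a great breakfast, happy Monday everyone",
    "New playlist for the weekend is up, enjoy",
]
train_labels = [1, 1, 0, 0]

# TF-IDF features plus a linear classifier: modest enough to run on standard
# hardware, yet able to generalise beyond a fixed list of keywords.
classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
classifier.fit(train_tweets, train_labels)

# A plain keyword search for 'election' would miss the first tweet below and
# flag the second; the classifier instead scores each tweet on its content.
incoming = [
    "Scuffle reported outside a tallying centre, police deployed",
    "Can't wait for the election day public holiday, sleeping in!",
]
for tweet, prob in zip(incoming, classifier.predict_proba(incoming)[:, 1]):
    print(f"{prob:.2f}  {tweet}")
```

In practice such a classifier would be trained on a much larger labelled sample and combined with filtering and deduplication steps, which is why the report notes that technical expertise was required.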

As a result of the work conducted for this project, a draft framework, the ‘3Vs of Crowdsourcing (During Elections?)’ (Viability, Verification, Validity), was created, with the hope that it will help new and existing crowdsourcing deployments, media organizations, open data platforms, and similar operations to better assess whether crowdsourcing during a particular election in a particular country is indeed a viable way to gather verifiable information.

Download the article here.

Download the Final Report here.

Download the 3Vs Crowdsourcing Framework here.

Download the 3Vs Framework Visual here.

Watch the 3Vs of Crowdsourcing video here.
