Big data meets thick data

The tools that INGOs have traditionally used to make sense of the changing world are under strain. Donor reporting and legibility requirements increasingly call for quantitative indicators of impact, yet such indicators are ill-suited to the messy reality of many development challenges. And 2016 has shown that tools like polls, focus groups and surveys struggle to capture the sentiment and unarticulated needs of a population.

Initially hailed as a potential panacea for these limitations, big data has proven challenging to implement and, at times, harmful. Without proper handling and contextualisation, big data risks becoming deep fried data.

Paradoxically, then, the era of big data needs even more qualitative, granular knowledge of local contexts. A new approach is emerging, combining a growing interest in design thinking with the sector’s long-standing tradition of ethnography: the integration of “thin” big data with thick data. Development innovation labs are beginning to question what donors and governments accept as “acceptable” evidence and are looking to bridge the divide between quantitative and qualitative methods.

In 2017, I predict INGOs will combine big data insights with user research and ethnography, exploring the fuzzy “in-between spaces” of data practice.

In particular: