
Alidade – Choosing technology tools

Picking a technology tool is only a small part of a project – but getting the right one can make a crucial difference. With partners in Kenya and South Africa, we led a two-year research study investigating how transparency and accountability projects choose technology, and identifying strategies that could make a difference to projects’ success.

Identifying the need

More and more research is being done on what makes transparency and accountability projects likely to succeed. Meanwhile, organisations are increasingly making technology tools central to their project strategies. But there’s relatively little evidence showing how those technology tools get chosen, and even less to suggest whether some selection processes are more effective than others.

We wanted to start collecting this evidence. We’ve seen through our direct support work how important tool selection can be: a tool that does exactly what an organisation needs can significantly increase a project’s impact, while picking the wrong one can waste valuable time and resources. We wanted to find out how organisations approached their decisions, and whether it affected their projects.

The right tool for the job

So, we partnered with Mtaani Initiative at Pawa254 in Nairobi and the Network Society Lab at the University of the Witwatersrand in Johannesburg to design an in-depth comparative research project, supported by Making All Voices Count. We wanted to draw out common problems and solutions that would help organisations, funders and support organisations plan better, and to create practical resources that could support that process.


Identifying a research question and designing research methods.

Conducting a survey and 38 in-depth interviews with organisations in Kenya and South Africa.

Using different formats to communicate what we found, including short heuristics, peer-reviewed articles, practical interactive tools and in-depth research reports.

Understanding how tools get chosen

We started with an online survey to assess how Kenyan and South African civil society organisations choose technology tools, and how they felt about the process. Because we were more interested in how a tool gets chosen than in the tool itself, we asked about hardware as well as software, and about tools used for general operations as well as for specific projects.

Using the survey responses to give us a general picture of the field, we interviewed staff from 38 Kenyan and South African organisations that had recently chosen a digital tool for a project related to transparency and accountability. We deliberately spoke with people from a mix of sectors, aiming to hear from organisations with differing sizes, levels of professionalisation and experience with technology.

We asked them why they had chosen a particular tool, how they chose it, and if they were happy with the results. We wanted to understand how organisations actually made decisions – noting that decision-making can be messy, doesn’t always follow linear processes, and is often influenced by personal relationships.

"The Engine Room have worked with great integrity and commitment to maximise the impact of their research in influencing positive change in the world."

- Duncan Edwards, Institute of Development Studies

What difference does it make?

Less than a quarter of the organisations we interviewed were happy with the tools they had chosen. After deciding to adopt a tool, they often discovered technical issues that made it hard to use, and half found that the tool wasn’t used as much as they’d expected or in the way they had hoped.

These problems were often linked to the way that organisations chose tools. Most did very little research on their intended users, the technology options available and the problem the tool was expected to solve. More than half the organisations built a tool from scratch without checking if existing tools could do the job, while few organisations tested out a tool before choosing it (particularly with the tool’s intended users).

Sharing what we found

We wanted to share the findings in formats suited to different audiences and different ways of learning. We’ve discussed all this in much more detail in a short summary for people curious about the findings (online and as a small pdf), a full research report (2.5MB pdf), and a peer-reviewed article in the IDS Bulletin for researchers.

As well as identifying problems, we wanted to help organisations make better choices next time. So, we developed six rules of thumb (or heuristics) for choosing a tool, along with the Tool Selection Assistant, an online guide that asks questions, gives examples from our research and links to resources. The Tool Selection Assistant is also on GitHub, so others can host it on their own servers and adapt it for other sectors or needs.


Home page of the Tool Selection Assistant

Plenty more research can be done in this area. We’re continuing to discuss and write about the findings and to develop the Tool Selection Assistant in partnership with other organisations working in this area. And, of course, there’s always a need for more research to test whether these findings apply beyond Kenya and South Africa.
