Technology is playing an increasingly prominent part in civil society efforts to strengthen citizens’ voice and hold governments to account in Africa. But to make a technology-driven project successful, picking the right tool is essential.
Even though the tool itself will play only a small part in determining whether a project works or not, the choice is often a critical one. Choosing poorly can frustrate projects, sucking away time and money that could have been spent elsewhere.
However, we know relatively little about how organizations in sub-Saharan Africa choose and implement information and communication technology (ICT) tools.
To find out more, the engine room has been working in Kenya and South Africa with the civic arts organization Pawa254 and the Network Society initiative at the University of the Witwatersrand, on a research project supported by Making All Voices Count.
We wanted to draw out common problems and solutions that would help organizations, funders and support organizations to plan activities in future, but we also wanted to test whether we could create something that would help organizations select the right tools themselves.
This blogpost explains what we’ve done so far – and what’s next.
Understanding how a tool gets chosen
We started with an online survey to assess how Kenyan and South African civil society organizations working on voice and accountability choose ICT tools. The survey asked for information about the organization’s size and technical capacity, as well as what tools they used and how confident they felt about selecting tools.
Because we were more interested in how a tool gets chosen than what the tool itself is, we asked about a range of tools: hardware as well as software, and tools for an organization’s general operations as well as tools used in specific projects.
Using the survey responses to give us a general picture of the field, we ran 38 in-depth, in-person interviews with people who had recent experience of choosing tools for voice and accountability projects. We deliberately targeted people from a mix of sectors, aiming to hear from organizations with differing levels of professionalization and experience with technology.
We wanted to understand how organizations actually made decisions, noting that decision-making can be messy, doesn’t always follow linear processes, and is often influenced by personal relationships.
What did we learn?
Most organizations that we surveyed were confident about their general capacity to choose tools: most survey respondents said that they kept up with which new tools were available, while 78% of all organizations thought they had the knowledge and skills to choose appropriate tools. Organizations were also often fairly confident that they could choose the right tool for their project: of the 144 South African respondents involved in a voice or accountability project, only 37% found it difficult to choose a tool, and 29% found it difficult to use the tools that they eventually selected.
However, organizations were much less positive when we asked them about a time when they had chosen a specific tool. Less than 20% of the initiatives described the tool they had chosen as a success, with common problems including:
- The tool didn’t work as the organization expected it to – or it didn’t work at all.
- The tool’s intended users didn’t use it in the way the organization had hoped.
- The project took much longer or cost much more than planned (or both).
- The organization was unable to find a technical partner that it could work with well.
Organizations lacked knowledge in key areas: many started with very limited information on what they needed the tool to do, or had little idea of which tools could do what they needed. Very few organizations had much detailed knowledge about how tools worked before they chose them.
This may be partly because they rarely conducted significant amounts of research. When they did do research, they hardly ever examined both the tool-users' needs and the tools that were available. When we asked initiatives what they would do differently if they ran the project again, gaining greater knowledge about users or tools was among the most common responses.
Organizations regularly had problems finding resources or technical support, and were not always satisfied with what they found. They often relied on a technical partner to select, build or implement tools, but this frequently led to difficulties. Many respondents expressed frustration over the limited number of partners available to them, while the technical partners that were available sometimes had significant tool biases or a limited understanding of project contexts.
Tentative tips for success
There appear to be a handful of simple strategies that can make the tool selection process more efficient:
- Test tools before committing to them. There seems to be no substitute for trialling a tool. In many cases, this appeared to be the best way to determine whether a tool was the right one. Trialling tools (by the people selecting the tool or – better – with actual users and audiences) can avoid unpleasant surprises, saving time and resources in selection and implementation.
- Compare multiple tools. The vast majority of initiatives surveyed reviewed only one tool, and almost none reviewed more than two. Comparing differences in tools can be an important way to identify hidden challenges in implementation.
- Conduct user research and tool research. Explicitly allocating time to background research on tool users/audiences and on available tools can make a tremendous difference, especially if these issues are researched in tandem.
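To make the "compare multiple tools" tip concrete, here is a minimal, hypothetical scorecard sketch. The tool names, criteria and weights below are illustrative assumptions, not something drawn from the research itself; the point is simply that writing down weighted criteria and scoring each candidate against them makes a side-by-side comparison explicit.

```python
# Hypothetical tool-comparison scorecard: a minimal sketch for
# illustration only. Criteria, weights and tool names are invented.

# Criteria and weights (a higher weight means the criterion matters
# more to this particular project).
WEIGHTS = {
    "meets_user_needs": 3,
    "cost": 2,
    "local_support_available": 2,
    "ease_of_use": 1,
}

# Scores (1-5) for each candidate, ideally gathered by trialling the
# tools with actual users rather than by guesswork.
candidates = {
    "Tool A": {"meets_user_needs": 4, "cost": 2,
               "local_support_available": 3, "ease_of_use": 5},
    "Tool B": {"meets_user_needs": 3, "cost": 5,
               "local_support_available": 4, "ease_of_use": 3},
}

def weighted_score(scores):
    """Sum each criterion score multiplied by its weight."""
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

# Rank candidates from highest to lowest weighted score.
ranked = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores)}")
```

Even a rough scorecard like this forces the selectors to state what matters before falling for a single familiar tool, which is exactly the failure mode the interviews surfaced.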
Funders can play a role
Drawing on our preliminary findings, we can suggest some ways that funders could better support organizations.
- To address the lack of information, resources and technical partners available in-country (and significant tool biases among available technical partners), funders should consider supporting the development of stronger local networks that can help with selecting and using tools, or supporting organizations that can provide tool-agnostic support.
- Funders, and training or tool-building organizations, should be wary of the risks involved in making their support dependent on specific tools or partners with which they are associated or particularly familiar. They should also allow for flexibility in the choice of the appropriate tool, including factoring in the potential for changes along the way.
- Key factors to look for in organizations’ project design include a clear, realistic understanding of prospective users, and whether the organizations taking on complex tools have the capacity they need to manage them.
Turning tips into a structured framework
Perhaps most importantly, organizations expected the tool-selection process to be much easier and smoother than it turned out to be. They often chose less effective or less appropriate tools because they couldn't predict which problems they might encounter when implementing the new tool. In short, they lacked the prior experience or information needed to anticipate risks and avoid these problems.
But we didn’t just want to identify problems: we wanted to find out whether we could create a method of helping organizations to choose better tools. So we developed an interactive online framework, a four-step process that guides an organization through tool selection in a methodical way. The guide’s steps are directly based on the findings from our research – and we’ve included real-life examples from our research.
Each step is divided into 5-10 smaller tasks, which the user can complete whenever they have time. The framework records decisions as the user goes through the process. Then, it allows them to create a PDF document that summarizes tool specifications, provides text for tenders and proposals, and produces a record of all the research that went into choosing the tool for the project.
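As a rough illustration of the record-keeping idea behind the framework, here is a hypothetical sketch of what such a decision record might look like. The class names, fields and example entries are our own invented assumptions, not the framework's actual data model; the sketch only shows how recording each decision with its rationale makes it possible to generate a summary for tenders and proposals later.

```python
# Hypothetical decision-record sketch; field names, steps and the
# example project are illustrative assumptions, not the framework's
# actual data model.
from dataclasses import dataclass, field

@dataclass
class DecisionRecord:
    step: str       # e.g. "user research" or "tool comparison"
    question: str   # what the organization was deciding
    decision: str   # what it chose
    rationale: str  # why, for reuse in tenders and proposals

@dataclass
class SelectionProcess:
    project: str
    records: list = field(default_factory=list)

    def record(self, step, question, decision, rationale):
        """Append one decision as it is made, in any order and at any time."""
        self.records.append(DecisionRecord(step, question, decision, rationale))

    def summary(self):
        """Plain-text summary of all recorded decisions (the sort of
        content a PDF export could be built from)."""
        lines = [f"Tool selection record for: {self.project}"]
        for r in self.records:
            lines.append(f"[{r.step}] {r.question} -> {r.decision} ({r.rationale})")
        return "\n".join(lines)

# Example usage with an invented project and decision.
proc = SelectionProcess("Community reporting project")
proc.record("user research", "Primary channel?", "SMS",
            "Most intended users lack smartphones")
print(proc.summary())
```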
We’re piloting a first version of the framework with selected initiatives in South Africa and Kenya to see which aspects they find useful – and which they don’t. A more detailed description of our research findings and recommendations will be published in the coming months. Stay tuned!