This is a follow-up post on how the engine room helped Amnesty International understand how its researchers were using, and could use, technology and data in their work. This post discusses outcomes and plans; see a previous post on methods and process here.
As an international human rights watchdog, Amnesty International has been in the information business for fifty years. Where once there was never enough information to be had, Amnesty researchers now find themselves wading through an information glut to find reliable documentation and evidence of human rights abuses. Last year, we collaborated with the engine room to survey Amnesty International researchers about the tools and strategies they are using to collect, verify, store and make use of human rights documentation in this new information age.
In this post I will share some thoughts on the survey results and how the process has helped to spur broader momentum at Amnesty around one of the biggest questions facing the human rights field: what do we do about data? In a digital world where the possibilities of data are boundless, how do we collect it, validate it, secure it and use it responsibly to advocate for an end to human rights abuses?
Useful answers follow good questions
Every year Amnesty puts out an average of 25 research reports – from annual reports covering the globe to reports on focused issues and geographies, such as a recent report into US drone strikes in Pakistan. This is a big operation involving more than eighty full-time researchers. While years’ worth of institutional knowledge guides research and documentation practices, the day-to-day work of researchers is by nature decentralized and contextual. Add the dynamic changes that technology brings, and we found it difficult to move beyond abstract assessments to gather grounded insights about how researchers were going about their work. We were also looking for ways to understand how our traditional methodologies were interacting with emerging digital tools and trends.
An example of this is citizen-derived data for use in documentation of human rights abuses. Despite the increase in attention paid to this new kind of data collection, it remained unclear to us how our researchers were using this kind of data to supplement their research. Similarly, we believe that increased surveillance and cyber attacks pose grave threats to researchers dealing with sensitive information, but we don’t yet know much about how many of these researchers actually experience such attacks, or how they assess their communication risks day to day.
The engine room’s support on this survey helped us collect precisely this information, and produced actual data from which to understand research gaps and needs.
Tracking the trends
A broad institutional survey is no replacement for focused consultations with individual teams, and often raises as many questions as it answers. But it was a good place to start and pointed us in the direction of practical next steps. For example, one striking finding was that, while digital tools are becoming commonplace in research, 90% of those we surveyed still relied on individual testimony collected in person above any other kind of documentation. By far the most important tools for research? Pen and paper.
At the same time, it was evident that digital content has become an important new information source for researchers (for example, 58% of respondents had used YouTube in the course of their research in the year prior to the survey). Even so, the use of new types of documentation was firmly rooted in traditional methods. When using social media sources, most researchers expressed concern about the challenges of verification and reiterated that they would only use documentation corroborated by contacts on the ground.
The survey also validated researcher concerns about digital security. It was evident from responses that earlier protocols and strategies for operational IT security were no longer adequate given the increased complexities and risks of individual communications. This helped us to make a stronger case that the IT strategies already underway needed greater emphasis on individual behaviours and risk assessment processes, not only protocols and technological solutions for field research.
What next for human rights research?
Overwhelmingly the data collected from the survey emphasized researchers’ desire for greater exposure to new tools and strategies that can enhance and secure their work.
With the insights gained in this process, we have been able to create a stronger vision for where and how technology fits into human rights documentation and research at Amnesty. We are currently in the process of integrating and expanding this work to form a cross-purpose unit with a mandate to catalyse new methods of practice and pilot tools for research, tactical campaigning and rapid response. The unit will not be an isolated team but part of a global ‘ideas network’. We hope this will enable it to work more closely alongside partners with niche expertise in the tech, data and human rights arenas, and to continue to learn and share experiences with local partners, many of whom face precisely the same challenges in human rights fact-finding.
If we can embrace these big changes as a global movement, the opportunities for human rights documentation and action are vast. Understanding the needs and actual practice of researchers was an essential first step towards this larger effort. Equally, as the survey helped to bring into focus, not adapting is not an option.