Posted 18 October, 2013 by Christopher Wilson

Researching the Researchers: An Amnesty Survey to Support Better, Safer Tech in Human Rights Investigation

Back in March we started talking with the technology and human rights team at Amnesty International about how AI researchers are using digital and mobile tools to document human rights abuses. Amnesty has been pushing boundaries in this area, with flagship initiatives like the Science for Human Rights project. It is clear that digital and mobile tools can aid human rights research. At the same time, new tools pose security challenges that can put researchers and partners at risk.

Collecting information on how tools are actually used was a first step for Amnesty in making smart decisions about how to better support researchers in using digital and mobile tools strategically.

We decided to help out, and over the past six months we have worked with the technology and human rights team to design and implement a survey to better understand research practice, along with a plan for using that information to support researchers in navigating technological tools and a new media environment. It has been challenging on a number of fronts, and we are now preparing to share the first set of results within the organization.

This blog post describes the process so far.

In the Belly of the Beast

Amnesty is a large and complex institution, with a presence in nearly 50 countries and a secretariat of nearly 500 people. It’s also perched on the brink of momentous change. An ambitious restructuring process is currently underway that will move the bulk of Amnesty’s research and policy work out of the London-based secretariat and into regional offices. Intended to bring operations closer to ground staff and in-country realities, this process has been demanding for secretariat staff, and especially for researchers, whose work tends to focus precisely on the movement of information between countries and the secretariat.

This context makes it especially challenging for the organization to keep up with the day-to-day practices of its 80 or so full-time researchers. All researchers receive input and training from a Research Support Unit, but their day-to-day work is spread across departments. Each researcher works on different countries and issues, pursues different long-term and short-term goals, and has a different level of technological awareness and literacy. To get a better handle on their work and needs, it made sense to survey them directly.

Framing the Questions

After working remotely with the technology and human rights team to understand the institutional and strategic context, we came to London to meet with key people at management level across the research, IT and information resources teams. We hoped these meetings would help get middle management on board and, in turn, help get the survey filled out. They were also useful for informing survey design and identifying smart indicators, especially because they helped to uncover different (and sometimes opposing) informational needs.

Lastly, and perhaps most importantly, this phase included a small workshop with management. The goal was to find out what Amnesty wanted to know and what it would do with that information once it had it. We used this workshop to brainstorm priority institutional needs for information, then identify the specific actors and practices that could provide information to answer each of those needs, and finally the concrete processes in which that information might be used. Mapping out all three of these levels was important for understanding differences in how information and processes are conceptualized by different actors within the organization. It was also key to designing questions, indicators and data points that, when churned out of the survey machine, made sense to the people who were to use them, and could quickly be incorporated into policies and decision-making.
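To make the three levels concrete, here is a minimal sketch of what one row of such a mapping might look like. The need, sources and processes shown are invented for illustration; they are not taken from the actual workshop.

```python
# One hypothetical "row" of the three-level mapping described above:
# an institutional need for information, the actors and practices that
# could answer it, and the concrete processes that would use the answer.
# All content here is invented for illustration.
mapping_row = {
    "need": "How do researchers store sensitive interview data?",
    "answered_by": ["country researchers", "field storage practices"],
    "used_in": [
        "digital security training curriculum",
        "information management policy review",
    ],
}

for level, value in mapping_row.items():
    print(f"{level}: {value}")
```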

One challenge that accompanies this approach, however, is the sheer number of questions one wants to ask. This is a problem for survey work generally (short surveys are hard to build because researchers want to know everything; long surveys don’t get filled out, annoy respondents and risk poor data), but it’s worse with more cooks in the kitchen.

To address this (and all the great input we got from workshopping and meeting with staff), we started survey-building by listing all of the objective and subjective indicators the survey team was most interested in, then mapping them against the priority issues flagged by the different departments and by management. Wherever there was overlap, we structured questions to produce indicators that would speak directly to policy or training needs. This also proved a helpful tool for determining which questions had to be cut or combined.
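A rough sketch of that overlap exercise, with hypothetical indicators and stakeholders (none of these names come from the actual survey):

```python
# A minimal sketch of the indicator-mapping exercise described above.
# Indicators and stakeholders are hypothetical, for illustration only.

# Candidate indicators, mapped to the departments that asked for them.
indicator_demand = {
    "tools_used_for_collection":  {"IT", "Research Support", "Management"},
    "encryption_habits":          {"IT"},
    "self_rated_digital_skills":  {"Research Support", "Management"},
    "satisfaction_with_training": {"Research Support"},
}

# Keep questions whose indicators serve two or more stakeholders;
# flag single-stakeholder indicators as candidates to cut or combine.
keep = {k for k, v in indicator_demand.items() if len(v) >= 2}
cut_or_combine = set(indicator_demand) - keep

print("keep:", sorted(keep))
print("cut or combine:", sorted(cut_or_combine))
```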

The final survey was medium-sized (fewer than 50 questions) and covered a number of issues, including organizational context (essentially demographics), common practices for collecting, sending and storing information, and researchers’ self-evaluations of their own skills and resources. Given the survey structure (lots of branch logic) and content, a web-based approach made the most sense. We didn’t want to host it on a third-party server due to the sensitive nature of the information we were collecting, so we built it in Gravity Forms and hosted it securely on our website, so that Amnesty could maintain full control over the data. Gravity Forms didn’t do everything we hoped it would (we especially missed ranking and multiple-choice matrix question formats). In hindsight, we probably should have gone with LimeSurvey, a wonderfully flexible, free and open source alternative that requires just a touch of technical and configuration know-how.
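For readers unfamiliar with branch logic, the pattern is simple: each question carries a condition on earlier answers and is only shown when that condition holds. A minimal Python sketch of the idea (question IDs, wording and conditions are all hypothetical, not drawn from the actual survey):

```python
# A minimal sketch of survey branch logic. Each question carries an
# optional condition on earlier answers; it is skipped when the
# condition is False. All questions and answers are hypothetical.

questions = [
    ("q1", "Do you collect testimony on a mobile phone?", None),
    ("q2", "Which apps do you use to record it?",
     lambda a: a.get("q1") == "yes"),   # only shown if q1 == "yes"
    ("q3", "How do you transfer recordings to the secretariat?",
     lambda a: a.get("q1") == "yes"),
    ("q4", "How do you usually document abuses?",
     lambda a: a.get("q1") == "no"),    # the alternate branch
]

def run_survey(scripted_answers):
    """Walk the question list, skipping questions whose condition
    evaluates to False given the answers collected so far."""
    answers = {}
    for qid, text, condition in questions:
        if condition is None or condition(answers):
            answers[qid] = scripted_answers[qid]
    return answers

# A respondent who answers "yes" to q1 never sees q4:
print(run_survey({"q1": "yes", "q2": "signal", "q3": "sftp", "q4": "paper"}))
```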

Rolling it Out

It took us a couple of months to get a significant number of responses to the survey due to travel and internal priorities within Amnesty. In hindsight, it is also clear that incentives for researchers to fill out the survey were not as strong as they could have been. Had the survey been integrated into other, organization-wide processes, or been more actively promoted by middle management, we might have gotten more responses faster. We have now received enough responses, however, to begin looking carefully at how to use them.

Amnesty’s restructuring presents a unique opportunity to directly impact information policy, practices in the field and the training curriculum for researchers. But the restructuring also presents a danger that the survey will be steamrolled by other, more pressing issues, or will get caught up in the negotiations that inevitably accompany any institutional process. To navigate this context towards the best possible impact, we decided to release results in short, targeted briefs. Each brief (three in total) will focus on a specific issue, starting with digital security practices among researchers. We hope this will make fairly complex evidence an accessible and compelling basis for management decisions, while also giving voice and perspective to the researchers represented.

We’ll be distributing this first brief on digital security within the organization this week, and then iterating with briefs on information management and research tools and strategies. The data and the briefs themselves won’t be made public, but we will be blogging more about the process. Uncovering how large organizations relate to novel contexts and tools and then using this evidence to support progressive organizational change is a complicated process. But we know that Amnesty is not alone in facing such challenges – nor in seeking novel approaches and solutions. We hope that sharing our materials and experiences can be useful for others. We’ll be back in a couple of months with some thoughts on what the survey results led to, what we learned, and what might be useful for other organizations in similar circumstances.
