At the outset of our research project on digital ID, we aimed to design consent and data management processes that uphold the dignity and rights of the communities in our study. As we’ve noted, it is often unclear whether communities purportedly being offered “informed consent” actually have the space to exercise agency and meaningful choice over how their information is collected and used. In this blog post, we dive deeper into how we localised and applied consent and data management processes, and outline some of the challenges we faced.
Setting up the dialogue
As we wrote in an earlier post, we co-designed the research framework with our team of in-country researchers, leaving space for each researcher to design their own methods for data collection. As part of the research framework, we facilitated conversations with the team about what we mean by ‘informed consent’ and a responsible approach to data collection.
We provided researchers with templates for discussing the purpose of the research, how the data would be used and protected, and how consent would be obtained, encouraging them to speak of consent in common terms that communities would recognise. In each site, consent was given in writing or verbally, participants were given the opportunity to opt out at any point, and researchers checked consent again at the end of focus groups and conversations to be sure everyone was comfortable with what they shared.
Contextualising conversations about consent
In our efforts to both contextualise consent processes and cultivate a shared understanding, we worked with the researchers to develop baseline agreements:
- Our starting point was that information would be anonymised and no real names would be collected in the research, with the exception of authorities who served as key informants (unless they wished to remain anonymous).
- Any names gathered in focus group registration or consent forms would be stored securely and separately from all other information.
- Researchers would aim to hold focus groups and interviews in as private a space as possible.
- Where individuals were willing and safely able to share information in a way that would visually identify them (e.g. photos, video clips or diaries), this would be done outside of the focus group setting and with additional informed consent.
In addition to this baseline, we also reviewed on a case-by-case basis what would constitute ‘sensitive data.’ For example, while in some countries any mention of a specific religion in a conversation wasn’t a cause for concern, in other countries this mention immediately elevated the level of sensitivity of the data.
Challenges we faced
In designing participatory processes for consent and data management, there are only a limited number of ‘known unknowns’ one can prepare for, and we faced a number of challenges throughout our research.
When trying to communicate to participants how their data would be used, we learnt quickly that these conversations are complex and take time. To smooth this process, we practised data minimisation: collecting only the minimum data necessary. Because we could not be certain that people fully understood the implications of their participation being public, we decided not to use any photos or videos of people we interviewed. In many cases, participants had survived persecution and targeted violence or were living in an authoritarian atmosphere, and their safety was paramount. (To represent the context in our final report, and give readers a visual sense of the process, we worked with our designer to create illustrated representations of what the conversations looked like.)
We also faced challenges related to the physical journey of researchers and the data they held. When holding focus groups and in-person interviews, researchers sometimes struggled to find private, secure spaces for these meetings. In some cases, researchers were required to travel to locations with an elevated risk profile to conduct their research. We worked with all researchers to create a risk matrix and kept tabs on their physical security, prepared to put response protocols into place immediately. We also created digital security processes for them to follow, such as transferring data to our secure server, where we kept each research site’s data in separate password-protected areas.
For us, the process is always as important as the findings. We expected that issues like informed consent and privacy would feature heavily as we explored people’s lived experiences with digital ID systems. It was our responsibility to be sure we were not repeating some of the same problems we were researching, but instead exploring more just and fair ways of addressing those issues.
As always, we are curious to hear your thoughts about responsible and participatory research design and practices. Feel free to reach out via email at pverhaert[at]theengineroom.org or tweet at @engnroom.
All illustrations by Salam Shokor.