A couple of weeks ago, the Responsible Data (RD) mailing list saw a flurry of activity – 72 emails in three days – centred around the news of the World Food Programme’s (WFP) partnership with Palantir Technologies. The list has become a valuable space for those interested in responsible data issues: proactively examining the unintended consequences of using data in the social sector, and developing best practices to protect people’s rights. The discussion highlighted for me how valuable a space for discussion, respectful disagreement and even opposing perspectives really is, particularly on a topic like this.
The discussion culminated in an open letter to WFP, outlining the RD community’s concerns about the lack of transparency, potential risks to the food aid recipients WFP serves, bias in the models Palantir might use, and accountability processes within the partnership. The letter serves not just as direct advocacy to WFP, but also as a signal to other humanitarian agencies considering working with similar private sector companies of the standards we all expect from each other in this space. The letter – now signed by 69 organisations and individuals – was also a speedy show of solidarity from groups that, in many cases, may not have worked together before, and a first step in what we hope will be a broader advocacy campaign on the topic.
Strength in diverse perspectives
The nearly 1000 people on the RD mailing list come from many backgrounds: activists, computer scientists, academics, civil society practitioners and advocates, and more. Perhaps more important than the diversity of their professions are their differing associations: some are freelancers, others associated with large UN institutions, small civil society groups, or activist collectives. These different backgrounds create a great diversity of perspective, too.
The value of this diversity of perspective came to the fore during the WFP and Palantir discussions. We heard from people working closely with Foreign Ministries in other countries, from academics who had previously engaged in advocacy towards Palantir’s partners, and from humanitarian workers – particularly those who had seen first-hand the use of privacy-invasive technologies in refugee camps in their countries – who rightly pointed out that this kind of public-private partnership is really nothing new.
What ‘due diligence’ really means
But the fact that it is nothing new is no reason for these kinds of partnerships to continue. A core value of ours at The Engine Room is that ‘our methods support our mission’. By that, we mean that the way in which we achieve our goals needs to match up with the goals themselves. We believe this is true – or should be true – for others working in the social sector, too, if our work is to be taken seriously and if we are to avoid a disconnect between what we fight for and what we do. At its heart, that’s what the Responsible Data community is all about: figuring out how to hold ourselves to those high standards when we’re working with data in new and different ways.
The tension brought up by the partnership between WFP and Palantir is, of course, much bigger than just one UN agency’s decision to work with one company, and ultimately, has implications for the humanitarian space and beyond. Addressing the implications means considering how partners are chosen, examining what ‘due diligence’ really means and recognising that choosing whether or not to practice responsible data is a political decision that will result in trade-offs.
In today’s world of surveillance capitalism, due diligence for organisations who hold sensitive data means more than making sure potential partners tick the right boxes of capacity, expertise or experience. It means considering the business models, data practices and values of those partners, too. It means being open to acknowledging that the data practices certain companies or institutions are built around are fundamentally not aligned with humanitarian approaches. In some cases, those data practices can be difficult to understand, and even more difficult to audit – but that may be a warning sign in and of itself. A lack of transparency around what a company is doing – be it in terms of data, its work or its projects – already says a lot about the values of the company itself.
Integrity in the face of complexity
The recent discussions on the RD mailing list also highlighted how complex a partnership like this one is, and how difficult it can be to understand, particularly given the lack of information around it. On one hand, humanitarian organisations often hold a huge amount of data, and keeping it secure and well managed is in itself a massive challenge. I’ve long been an advocate of thinking of personal data as ‘toxic’: it must be handled with care, minimised where possible, and a ‘spill’ is almost impossible to clean up. In that vein, humanitarian organisations take care of a great deal of toxic data, which is no small feat. They also operate under extremely difficult conditions – physically, digitally and often financially.
On the other hand, the position that humanitarian organisations are in means that they must hold themselves to extremely high standards, because the people they serve are often not in a position to hold them accountable directly. Palantir staff’s references to recipients of food aid as ‘customers’ highlighted for me how easily that relationship can be misunderstood by those outside the sector.
Customers are people who choose to purchase a good or service from a vendor, whereas recipients of food aid are people who have no choice but to receive food from an agency in order to stay alive. That hugely asymmetric power dynamic should be reflected in the way humanitarian services are provided and in how recipients’ data is managed: thoughtfully, intentionally and to the highest possible standard by default.
This is not about the WFP or Palantir, but about ensuring that exploitative data practices have no place in civil society work, be it through pro bono partnerships, in-house development or paid contracts.
That bigger issue is one that we hope to work on in the future: supporting decision-makers at large institutions to make decisions about their technology and data that match their values and mission, and working with the RD community to establish not only best practices, but also ‘red lines’ beyond which responsible data practices are impossible. If the discussions mentioned in this post are of interest to you, please sign up to the mailing list, or get in touch with us.