Learning when to stop

Zara Rahman

Recently, at a workshop my colleague Laura and I were leading, a participant asked:

“But do you have any examples of people actually saying ‘no’ to new uses of data?”

Together with the American Association for the Advancement of Science, we were workshopping decision trees designed to guide users towards more ethical judgements about using geo-located data. As they worked through the decision trees, participants noticed that quite a few paths led to the next action of “stop.”

For us, having pathways that led to “stop” was a crucial part of a tool that was designed to help people make better judgements about their use of data. If certain conditions are true about the context and the data at hand, the risks that will come from using that data outweigh the potential benefits. Particularly in the context of crisis situations, that means taking the bold decision to stop work on that path and to find another route to providing support.  
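
To make this concrete, here is a minimal sketch of what one of those decision-tree paths might look like if written as code. It is illustrative only – the questions, their ordering and the next_action helper are assumptions made for this sketch, not the actual tool we built with the AAAS.

```python
# Illustrative sketch only: a toy decision path for geo-located data,
# where several branches deliberately end in "stop". The questions and
# their ordering are hypothetical, not the real decision tree.

def next_action(consent_obtained: bool,
                can_reidentify_individuals: bool,
                crisis_context: bool,
                benefits_outweigh_risks: bool) -> str:
    """Walk one simple decision path and return the recommended action."""
    if not consent_obtained:
        return "stop"  # no meaningful consent: do not proceed
    if can_reidentify_individuals and crisis_context:
        return "stop"  # re-identification risk is highest in a crisis
    if not benefits_outweigh_risks:
        return "stop"  # risks outweigh benefits: find another route
    return "proceed, with safeguards"

# Example: geo-data that could re-identify people during a crisis
print(next_action(consent_obtained=True,
                  can_reidentify_individuals=True,
                  crisis_context=True,
                  benefits_outweigh_risks=True))  # -> "stop"
```

The point the participants noticed holds here too: several branches end in “stop,” and that is by design.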

Knowing when to stop is a critical skill

The decision to stop is an important demonstration of leadership – of being able to assess sunk costs and potential risks, and to make a call on what is most important. Knowing when to say no is, in my mind, one of the most important skills that people working in non-profits can develop today.

In our tech-saturated world, civil society is bombarded by offers of help from technology companies with varying levels of good intentions, whose contexts and experience are different from our own. If we genuinely want to use tech to strengthen our missions, we need to stop seeing tech companies’ expertise as more valuable than our own. Instead, we must put contextual and critical thinking ahead of the hype and magical-sounding promises they might bring.

Those working in advocacy, or fighting for social change, have a particular responsibility to put first the people they are working with and for. That means thinking creatively about potential risks before incorporating exciting new technologies with as-yet unproven consequences, and having the courage to stop if unforeseen risks arise (which commonly happens, as our responsible data work has shown).

It’s not a new observation, but ‘progress’ in the scientific sense isn’t necessarily progress towards a just and equitable future. Coming to terms with this fact – recognising that some technologies are a step backwards on that journey, no matter who uses them or for what – is necessary if we want to see our missions become reality. Within civil society, we can’t control what technology is developed, but we can control how we choose to engage with it.

What could this look like?

I want us to rethink how often we push for ‘innovation’ without questioning what values are hidden within it. Instead, I’d love to see us support advancements that intentionally and thoughtfully push for the future we want.

There are some examples out there already, but I believe we could all do more to build a culture that rewards this kind of behaviour as a type of progress in itself. I want to hear more about how facial recognition technologies should be compared to plutonium, as Luke Stark outlines in this recent paper, and to see more cities ban facial recognition surveillance technologies, as San Francisco did just last week. I want to see Oxfam held up as a leader in the humanitarian field for their decision to establish a moratorium on using biometric technologies, because of the huge risks of gathering immutable biometric data from the world’s most vulnerable communities. I want the government of Liberia to get the kudos they deserve for “uniquely refusing to share mobile network data with international organisations, despite repeated requests” during the Ebola crisis, in part due to their concerns about privacy and data protection.

I want to know how those decisions were made: the processes and workflows that went into testing assumptions, and who made those calls. I believe that by sharing these decisions, and the processes behind them, we can all get better at using technology that truly supports our values and our missions.

Image by Guillaume Bourdages via Unsplash.
