I’ve just returned from Barcelona, where mySociety brought together the civic tech research community for the Impacts of Civic Technology Conference (or #TICTeC). It brought up a lot of thoughts and things I want to delve more into, but for now, here are some of the questions that I have to start with.
How do we talk about civic tech in areas where the state is the main adversary?
Lots of the discussions had an underlying assumption that civic tech should involve government in some way: using technology to get information to governments more easily, to make governments more responsive, or to allow citizens to hold governments accountable.
But Maya Ganesh’s work with Jennifer Schulte and Jeff Deutch looked into tech for transparency and accountability in South Africa: a country with a brutal history of state violence. In that case, and in many other countries, civic tech could be much more about citizens self-organising than about establishing or strengthening citizen-government ties.
Discussions only lightly touched upon the “social cost of complaining”, but fear of being seen as a troublemaker is a barrier to digital participation that I’ve seen across the world – like in Venezuela, where your biometric data is required to shop for your groceries on assigned days, as well as to vote. Though the data is theoretically held by different entities, in the eyes of many people the two are linked – and people need to eat. If you don’t want to draw attention to yourself or your family, especially in politically restrictive countries, “participating” might be the last thing you want to do.
Um, what’s happening to all this data?
Almost all of the projects at TICTeC involved gathering data in some form, often very personal data: voting preferences, addresses and names, movements across a city. Some projects called for “better analytics” to be able to understand their user base better – and though that’s understandable, there’s a balance to be struck between gathering all the data and gathering data that could put users at risk.
In one talk, a researcher from a US-based university said: “Researchers give our time for free – we get paid in data.” He then went on to talk about using mobile metadata in Yemen, which can reveal all sorts of things – and in fact in Yemen, adversaries use mobile metadata to plan targeted drone strikes. If an academic department is gathering and getting access to that kind of data, what kinds of ethical checks and balances are there? And who else gets access to that data at the same time? As one participant highlighted, ethical review boards are unlikely to be able to manage the new responsible data challenges that researchers face.
For me, a first question to ask might be: would they be able to do the same research in the country where they are based? If not, that’s a first red flag that the work might not be ethically sound and may need revising.
How do we make ‘research’ findings useful to practitioners – and to the people who feature in that research?
In my experience, practitioners working in NGOs don’t have time to read white papers – and don’t really **want** to. As Duncan Edwards of the Institute of Development Studies (IDS) raised on the first day, those papers are often written in a highly inaccessible way – using a theoretical approach or jargon that is exclusive to academia. Either way, this discourages non-academics from diving into them, and in practical terms there are few incentives to encourage the time- and resource-challenged practitioner to make the effort.
In a similar vein, Rosie McGee from IDS talked about how evidence isn’t turning into action, and called for researchers to think harder about how to get practitioners to use their research. But at this, I couldn’t help but remember the wealth of research out there on how our human decision-making is often irrational. Even if practitioners had read those papers, it’s by no means certain that the findings would have influenced or changed actual practice. Instead, perhaps we should be paying more attention to research into how adults actually learn.
How do we balance “go where the people are”, and potentially getting locked into commercial platforms?
Facebook was mentioned a lot over the past two days, and there was a really interesting talk from two data scientists who work on civic engagement at the company. Their reach is almost unparalleled, and their potential to influence elections is huge (if a little scary). So it makes a lot of sense to be thinking about how we can use platforms like Facebook to reach people.
But we didn’t talk about the power balances in play there – essentially, if we’re designing civic tech initiatives that depend upon, say, Facebook, we’re putting ourselves in a position where we are totally dependent upon Facebook’s mission and decisions. Although this probably lies well outside their civic engagement team’s mandate, Facebook’s recent decision to prioritise ‘paid’ pages is reportedly having a negative effect on nonprofits’ ability to reach their supporters. Changes to the algorithms governing the main Newsfeed could change a lot about how those civic tech initiatives actually roll out, and we as a community have absolutely zero say in how that happens.
Where did the power + the politics go?
Somehow, we went from very few mentions of ‘power’ on Day 1 to an overwhelming use of it on Day 2. However, the values, beliefs and principles baked into civic tech were mentioned much more rarely in the sessions that I went to. It might be a bit meta, but what kinds of values are actually embedded within civic tech?
I love Laurenellen’s work on ‘Build with, not for’ in civic tech projects, and she’s also done some interesting thinking about who civic tech is actually for, and what principles the civic tech community should stand by.
Broadly speaking, we seem to have come to an understanding: the problem is rarely the technology, the problem is the people, or the political culture. One of the things I enjoy most about working at the engine room is that despite our tech and data focus, when I start speaking to people about their ‘tech’ problems, I end up talking about culture or organisational strategy.
So, what next? How can we start to acknowledge that prioritisation of people over tech more explicitly within the civic tech community?
Thanks so much to Gemma and the mySociety team for organising the event – we’re already looking forward to next year’s!