As part of The Engine Room’s process for designing a new organisational strategy, we spent some time rethinking how we want our research to inform the intersections of social justice activism and technology in the coming years. This post explores how our new strategy will impact our research work. For more on how our strategy ripples across our work, read Zara’s earlier post and keep an eye out for upcoming updates.
As with our organisational focus, our research and learning work isn’t fundamentally changing. Over the past several years, we have built a strong foundation of delivering carefully designed research products across a wide range of areas, such as tech for human rights, environmental justice, legal empowerment, anti-corruption and responsible data. In our research, we combine hands-on experience from our direct support with mixed research methods so that we can understand where technology and data can make the most difference for activists and civil society.
Moving forward, however, we aim to prioritise research and learning that specifically informs and strengthens the fight for social justice: providing critical analysis, shifting narratives, and suggesting context-specific pathways for how civil society can use tech and data in strategic, effective and responsible ways.
This refined approach, led by our Research + Learning Team, means that we will situate all our key research questions about the intersection of tech, data and civil society within a justice framework that examines or surfaces power dynamics. In other words, our research will explicitly look at who benefits most from specific tech and data solutions and how responsive these solutions are to the needs and concerns of local communities, especially those with the least power. We will continue prioritising the voices of people who stand to be most affected and exploring how the design and implementation of our research processes can better centre previously oppressed and excluded groups.
Here are some of the questions we seek to answer through our research and learning work.
Is tech (part of) the solution?
Through our work we see activists around the world taking advantage of emerging technologies in a number of ways: demanding (or creating) access to data that affects their communities, presenting information in new and accessible ways, using emerging technology solutions to document rights violations and power abuses, creating new digital communications channels to share their knowledge and strategies, and more. Our research aims to understand the myriad ways in which tech and data can open up new ways of working toward change.
But sometimes tech and data aren’t the best solutions for civil society. Often, experimenting with untested technologies can cause more harm than good. Many humanitarian organisations, for example, are angling to be leaders in implementing innovative solutions, such as biometric technology, to solve problems that are mostly related to limited resources.
Part of the problem is that within civil society, seemingly normative frameworks like ‘tech for good’ can promote the idea that all we need to solve societal and political problems is the right tech. This can lead organisations to prioritise intentions over reality, focusing so much on the ‘greater good’ that real harm is ignored or tolerated. Hype around emerging technologies can cause actors, including those with otherwise good intentions, to introduce tech tools into situations for which they weren’t designed or where the need for them is assumed rather than proven. Additionally, the tech and data landscape changes fast, and there is an abundance of information available. As a result, civil society often cannot keep up with the latest developments in new technologies and how best to work with them.
Our applied research carefully considers the various opportunities, challenges and purposes of a certain technology solution for diverse stakeholders. We surface the needs of diverse communities to understand if, and how, tech responds to them. We identify patterns across different sectors to examine the impact of theories and practices on various fields of work and communities. We also keep up to date with the fast-changing landscape of tech and data as it relates to social justice work with a critical but constructive lens.
What are the unintended consequences?
Sometimes we see civil society adopting tech that has the potential to harm the communities they serve. These organisations may determine that the benefits outweigh the risks, be unaware of the risks or have few alternative options. Fortunately, we see that social justice activists increasingly want to align their tech choices with their values. Implementing responsible data practices, choosing open source solutions, reusing and adapting existing tech, working with community networks and exploring the ethics around emerging technologies can go a long way toward ensuring more responsible alternatives are available. But activists often don’t know where to start, in part because little research exists on the impact of many of these tools and approaches.
This research gap on the impact of tech makes it difficult for civil society to understand the options available to them and to advocate for anything related to tech. To address that, we explore how various technologies affect different communities, especially those with the least power, in order to make recommendations that steer civil society and funders away from intrusive tech and toward people-centred alternatives. We are adamant about the need to hear and amplify local voices and individual experiences, recognising that knowledge lies in multiple spaces and in multiple forms.
Is civil society thinking enough about data?
We’re seeing growing interest in the concept of responsible data, with organisations developing policies and practices to protect people’s data and to prioritise consent, but we have also noticed a climate of increasing risk around data. The Covid-19 pandemic has forced many organisations and activists to transition quickly to remote work, which means they must rely on tech for nearly every aspect of their work. The need for efficiency, quick communication and reliable platforms that meet the needs of people in various environments can lead to reliance on platforms and tools that do not follow responsible data principles. This is happening as we experience a rise in authoritarianism and in both state and corporate surveillance, which makes the data civil society collects and creates ripe for misuse.
Through our research, we surface the ways data policies and practices can harm people and reimagine data approaches that protect and empower. We collect data for research and evaluation, and we practice what we preach when it comes to responsible data. We prioritise participatory frameworks for research and evaluation as much as possible, enabling affected communities to contribute to research design and choose what data they wish to share as we measure change. Our hope is that helping activists understand responsible data practices will enable them to make the most of remote work and organising, strengthening rather than limiting their ability to address social justice issues.
One of the strengths of The Engine Room, including the Research + Learning Team, is that we are distributed, spread across different continents. Currently, the research team includes members based in Belgium, Brasil and the United States. We draw from our own diverse backgrounds and the movements and networks in which we are embedded. This enables us to emphasise mutual learning, sharing and building on knowledge and strategies we can explore in different contexts.
We are local activists ourselves, working closely with fellow activists and researchers to strengthen movements. It’s rare for research findings to be co-created by and shared with the communities studied, which means those who might most need the findings for advocacy purposes are excluded from conversations about tech and data. This is why our approach matters: we identify patterns while prioritising local context, involving communities in the work and engaging them on the findings.
The decisions civil society makes about tech right now are critical to sustaining social justice movements and protecting and uplifting communities at most risk of harm. We’re dedicated to providing the solid evidence, critical analysis and easy-to-follow recommendations activists and organisations need to make tech and data choices that reflect their missions and push back against power inequities.
Read more of our research to see if it resonates with your goals and get in touch with email@example.com if you want to talk about an idea!