New research on the ethical implications of predictive analytics for children’s outcomes

Cathy Richards

Over the past several years, we've seen predictive analytics play an increasingly integrated role in everything we use, from streaming services that help us find the perfect show to phone assistants that try to anticipate our needs based on the time of day. Beyond fulfilling consumer needs, predictive analytics has shown promising early applications in civil society as well. Forecasting Covid-19 surges, for example, can help hospitals and communities plan and manage resources. Some of these models draw on valuable contextual data, such as demographic information, environmental variables and GPS tracking data, in ways that would be too complex for traditional statistical models. Other predictive models have been used to strengthen early-warning systems. For example, researchers have evaluated the use of satellite imagery to predict whether an area may be undergoing active violence and, if so, how many people may be displaced by it. This knowledge would allow humanitarian organisations to prepare for a surge in demand for assistance before it arrives. Predictive machine learning models have also been used to infer poverty data that is often lacking in the sector: combining high-resolution satellite imagery with publicly available survey data, researchers trained a neural network model to predict poverty levels at the cluster level.
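To make that last example concrete, here is a minimal sketch of what cluster-level poverty prediction from satellite-derived features might look like. The file names, column names and the choice of ridge regression are illustrative assumptions made for this sketch, not details of the published research, which used richer neural network models.

# A minimal, illustrative sketch: predicting cluster-level poverty from
# satellite-image features. File names, column names and the choice of
# ridge regression are assumptions made for this example, not details
# taken from the research described above.
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical inputs, one row per survey cluster:
# - features.csv: image embeddings extracted from satellite tiles by a
#   pretrained convolutional network (columns: cluster_id, f0, f1, ...)
# - surveys.csv: wealth estimates from publicly available household
#   surveys (columns: cluster_id, wealth_index)
features = pd.read_csv("features.csv")
surveys = pd.read_csv("surveys.csv")

data = features.merge(surveys, on="cluster_id")
X = data.filter(regex=r"^f\d+$").to_numpy()
y = data["wealth_index"].to_numpy()

# Regularised linear regression on top of the image embeddings; the
# published work used richer models, but ridge keeps the sketch small.
model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f} (+/- {scores.std():.2f})")

# Once validated, the fitted model can estimate poverty for clusters
# that were never surveyed, using only their satellite-derived features.
model.fit(X, y)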

Across various use cases, however, there have also been negative consequences, such as the exposure of private information and inaccurate predictions. For consumers, these consequences can feel mildly invasive or annoying. But the same systems can also put people at risk or directly harm vulnerable populations. For example, predictive models that use public service data can unfairly target those who rely on public assistance. In one case, individuals were pursued for debts to the government that had been incorrectly calculated. Other models, such as the one used by the British government to estimate students' exam grades, may make decisions that can permanently alter the course of a student's future.

In collaboration with UNICEF, we are starting a new research project to assess the use of predictive analytics for children. The aim of this project is to use established human rights, data ethics and Responsible Data for Children principles to create guidelines for the responsible development, application and evaluation of predictive analytics as it affects children.

Our research will consist of in-depth desk research, interviews and analysis of relevant case studies. We plan for our report to provide clarity around the technological methods used for predictive analytics, current and proposed uses, ethical considerations, risks and mitigation strategies. Further, we aim to explore the parameters that may define appropriate use of the technology, as well as potential future applications. We also intend to prepare a brief that distills the key risks and benefits of the technology into a decision tree that facilitates practitioners' work.

We hope that our findings can help build responsible ways forward and encourage more reflexive thinking about the use of predictive analytics for children, especially in what are often complex and sensitive contexts.

Want to get involved?

We would love to talk to you if:

  • You have case studies specific to children or young adults to share with us.
  • You’re a specialist in children’s issues (child rights, education, health, etc.).
  • You work in predictive analytics and can speak to the intricacies of artificial intelligence, specifically machine learning.
  • You can speak to the ethical components of any of the above.

Get in touch with cathy[at]theengineroom.org if you’d like to share your experiences. If you’d like to learn from ours, keep an eye on this space for upcoming calls and publications. You can also request support through our light-touch support programme.

Image by Hans-Peter Gauster via Unsplash.
