Data Justice Stories of 2018


Data Justice Project

This was an important year for our project, which kicked off in the first quarter of 2018 with a great deal of eagerness and enthusiasm. We’re equally raring to go in 2019, but before we do, we want to share some important stories from the past year that will inform our thinking in the next.

  • The year started with the Strava fitness app publishing de-identified maps of global running patterns, which inadvertently revealed the locations of military bases in active conflict zones. The incident points both to the overlap between sectors and to the fact that aggregate data visualisation can still have very real effects at the individual level: we have to think about a new type of unintended consequence.
  • Foreign Policy reported on a Chinese start-up’s deal with the Zimbabwean government to provide it with a facial recognition system. ‘By gaining access to a population with a racial mix far different from China’s, CloudWalk will be better able to train racial biases out of its facial recognition systems—a problem that has beleaguered facial recognition companies around the world and which could give China a vital edge’. These developments demand a global perspective on data and justice.
  • The ID story has surfaced in many versions during 2018. It’s important for humanitarian organisations to know who they are protecting so they can serve people’s needs and keep them safe. However, ID (and particularly biometric ID) poses a new risk for people fleeing violence. ‘Is collecting biometric data always in the best interests of refugees themselves? And if not, should the data be collected in the first place?’
  • The tension between digital ID’s function as a way to serve and represent people, versus its potential to discipline and exclude, is an important issue for data justice. This year’s judgement on the Aadhaar ID system from India’s Supreme Court demonstrated two opposing points of view: digital ID can be an important tool in ensuring welfare goes to the poor, but is also an unparalleled tool for profiling.
  • 23andMe’s announcement that they would begin sharing users’ genome data with the pharmaceutical company GlaxoSmithKline for ‘research and development of innovative new medicines and potential cures’ was a stark, uncomfortable reminder that users of these services are, in fact, both the customer and the product.
  • 2018 was the year that tech employees revolted. ‘We cannot avert our eyes from the human cost of our business,’ one Amazon worker wrote. Is worker resistance going to become one of the ways we think about justice in relation to data technologies?
  • The unseen labour of artificial intelligence is one of the most important stories of our times. When the poor from the Global South are involved in creating training data, debates on data and justice are especially critical. We look forward to informed discussions on platform labour and data justice over the coming year.
  • Ann Cavoukian, Ontario’s former privacy commissioner, resigned from her advisory role on Sidewalk Labs’ Toronto development, arguably the world’s largest smart-city experiment, backed by Google’s parent company Alphabet. In her view, privacy-by-design had failed. This is especially concerning because the project’s business model is explicitly built around policy mobility, where ‘solutions’ devised in response to residents’ data protection concerns can be exported globally.
  • Ah, the tech stock ‘correction’. In late 2018 the world’s richest companies looked like less of a sure bet for investors, mainly because of their monopolistic practices, their disregard for users’ best interests, and the fact that the EU, and possibly at last the US, plan to regulate them in ways that are actually noticeable.

About the project

Places and populations that were previously digitally invisible are now part of a ‘data revolution’ that is being hailed as a transformative tool for human and economic development. Yet this unprecedented expansion of the power to digitally monitor, sort, and intervene is not well connected to the idea of social justice, nor is there a clear concept of how broader access to the benefits of data technologies can be achieved without amplifying misrepresentation, discrimination, and power asymmetries.

We therefore need a new framework for data justice integrating data privacy, non-discrimination, and non-use of data technologies into the same framework as positive freedoms such as representation and access to data. This project will research the lived experience of data technologies in high- and low-income countries worldwide, seeking to understand people’s basic needs with regard to these technologies. We will also seek the perspectives of civil society organisations, technology companies, and policymakers.