MyData’s Nordic model for data governance, a libertarian fantasy worth engaging with

Linnet Taylor

Linnet Taylor is Assistant Professor of Data Ethics, Law and Policy at the Tilburg Institute for Law, Technology, and Society (TILT). She was previously a Marie Curie research fellow in the University of Amsterdam’s International Development faculty, with the Governance and Inclusive Development group. Her research focuses on the use of new types of digital data in research and policymaking around issues of development, urban planning and mobility. She was a postdoctoral researcher at the Oxford Internet Institute, and studied for a DPhil in International Development at the Institute of Development Studies, University of Sussex. Her work focuses on data justice – the development of a framework for the ethical and beneficial governance of data technologies across different regions and perspectives.

One of the central puzzles for our research on global data justice is what ‘global’ can mean when we talk about governing data technologies. So far we barely have governance approaches that are international, let alone global, and we don’t have much idea of what we might want from a global approach if it existed.

Why might we want such an approach? Well, it’s clear that technologies built in one place get used in many others, but that they play out very differently in those different places. WhatsApp can serve to link friends in a country at peace, but in an area of ethnic or religious conflict the same technology can provide a way to set fire to political tinder, fuelling violence and persecution. Database technologies that are innocuous administrative tools in one place can be used to marginalise and exclude in others. And most importantly, the data market is global. When a developer builds an app, she ends up collecting data from users all over the world – and usually also selling that data to brokers such as Google, Facebook and Acxiom.

Given that the collection of data, and the market for it, are now global, what might we want to do about it to protect people from the adverse effects of the work their data doubles do around the world? As a way to get a handle on this question, we are seeking out groups who make a statement about how data should be governed, and working to understand what implications their governance models have on a broader level.

One thread to follow in this area is the MyData group. Conceptualised in Finland as a model for people to capture and share the data that is created about them, this has grown into a proposed ‘Nordic model’ for data management, and has gradually spawned groups around the world interested in the claims the group is making about how to use and share data. The initiative has three main tenets: that people should be able to manage their own data and privacy, that data channels should be standardised so people can access ‘their’ data (in the process making new business models possible), and that everyone should be able to move their data from one provider to another without penalty. All this is built on the principle of data portability that is central to Europe’s new GDPR.

MyData is interesting because it is turning from an idea into a movement, and gradually globalising. As non-Finns soon realise when they take part in the discussion, the model is based on Finnish principles of trust amongst individuals, service providers and the state. It relies on a clear idea of what counts as personal data (which, as Nadya Purtova has pointed out, is problematic), on the idea that ‘control’ of your data and privacy is actually possible (which it is, with the caveat that many other bodies will simultaneously control them too in ways you have no influence over), and on the notion that the GDPR’s principle of portability can be enforced even where it goes against the commercial interests of tech firms. Outside the context of the trusting relationship between Nordic citizens, their states and their digital providers, it looks like a libertarian fantasy in which individuals take on the power to regulate and control how their digital selves are used and transacted.

So what happens when that is tried elsewhere? Can the Nordic model be transferred, scaled or reworked to fit other social and political environments?

As MyData grows and evolves, other groups around the world are expressing interest in its principles. Portability may be the easiest fundamental principle to apply in new places, given that it can be translated in different ways and to different extents. Importantly, the MyData principles are not predicated on all data becoming portable: for instance the movement does not have an explicit position on data derivatives such as profiles and inferences. Instead, MyData focuses on primary streams of data that can be identified by people themselves, and duplicated wholesale into other channels. Examples are commercial transaction records from particular services, administrative data collected by government bodies, quantified self data, and health records.

One practical question that arises if MyData internationalises is that of the technical infrastructure to actually do something with any data that is made portable. The model is predicated, to some extent, on parallel innovations that make it possible for people to direct and control what happens to (copies of) data about them. No one wants to invest the time it would take to actually manage all the data produced about themselves – nor is this a challenge anyone has had to meet yet, given that the entire data brokerage industry exists to do just this.

The chief question MyData brings up for a researcher thinking about social justice is ‘what is data?’ There are various definitions, which depend on political and institutional perspectives. In the MyData model, which inherits the ontology of innovation and business, digital data is an asset that can be manipulated, transferred and traded. Seen from a social perspective, however, it may be other things. Christian Fuchs, among others, addresses digital data as labour, while Lina Dencik works on the idea that digital data is social relations and deserves to be treated as an extension of people themselves. Meanwhile, the philosopher of information ethics Luciano Floridi states that ‘“my” in “my data” is not the same “my” as in “my car”, it is the same “my” as in “my hand”, because personal information plays a constitutive role of who I am and can become.’

These perspectives are, to differing degrees, incompatible with the MyData principles. However, these incompatibilities also offer a way to evaluate different claims. If we see pushback around the world in relation to particular claims about how people can own or manage data, we can use this to understand what kind of fundamental principles for data governance might be acceptable in different places. So we should take heed of the fact that groups around the world are interested in MyData: it suggests that there is a perceived need for transnational approaches to governance; that people may be noticing the opacity of the data market; and that they would like alternative arrangements.

MyData will inevitably gain traction with the community of innovators worldwide who are interested in building alternatives to the tech-giant economy. It will also be attractive to people who are aware of the work their data doubles are doing, and who have the technical capacity and the leisure to engage in managing them. From a social justice perspective, however, the most interesting outcome will be if discussion and adoption of the model raises the question of what we cannot see: the invisible labour our data doubles are doing, and how they are being aggregated across national boundaries. In turn, this leads to some fundamental questions about how we want to govern data – questions that cannot be answered by the current state of knowledge.

About the project

Places and populations that were previously digitally invisible are now part of a ‘data revolution’ that is being hailed as a transformative tool for human and economic development. Yet this unprecedented expansion of the power to digitally monitor, sort, and intervene is not well connected to the idea of social justice, nor is there a clear concept of how broader access to the benefits of data technologies can be achieved without amplifying misrepresentation, discrimination, and power asymmetries.

We therefore need a new framework for data justice integrating data privacy, non-discrimination, and non-use of data technologies into the same framework as positive freedoms such as representation and access to data. This project will research the lived experience of data technologies in high- and low-income countries worldwide, seeking to understand people’s basic needs with regard to these technologies. We will also seek the perspectives of civil society organisations, technology companies, and policymakers.