This was the second meeting we have convened with European civil society organisations (CSOs) and other experts to share findings and test conclusions from our ongoing work within the Global Data Justice project, based at Tilburg University in the Netherlands and funded by the European AI Fund. The purpose of the project is to map, research and report on changes in technology firms’ business strategies and areas of involvement in society as a result of opportunities and markets created by the COVID-19 pandemic.
Our focus in this project is on the private sector’s activities during the pandemic. Specifically, we are looking at technologies and corporate strategies that affect people’s ability to participate in society, from contact tracing to workplace technologies, and from border security to public-sector algorithmic technologies. Some of this technology is offered pro bono, some comes through government procurement or public-private partnerships, and other instances involve corporations expanding their infrastructures and capacities in ways that take them into the public sphere. The common thread we follow is technology that affects people’s agency, and thus raises justice issues in terms of autonomy, citizenship and rights.
In this meeting we convened a smaller group of civil-society organisations and experts to feed back on and test our conclusions so far. Participating organisations included Homo Digitalis (GR), Tactical Tech (DE), Privacy International (UK) and both policy and legal experts from a number of European countries. This report outlines the discussion, but also serves as an invitation to other organisations and researchers to engage with our findings and offer their own experiences and feedback.
How has this problem evolved over the course of the pandemic?
The starting point for this meeting was an update on our work on the European sector transgressions project, which can be found here. We have been working on this project for eight months, and have assembled a database of instances where firms have shifted their activities across sectors in the context of the pandemic, together with an analysis of the resulting effects within different sectors and markets, as well as the forms of resistance that have emerged.
During the pandemic we have seen technology firms’ strategies go through different phases. First, there was a rush to market products and services to government in response to the initial phase of the pandemic, where the declaration of emergency conditions created particular opportunities in contact tracing, healthcare supply logistics and other fields such as transitioning education at all levels to online spaces and methods. Many of these interventions were offered as ‘pro bono services’ designed to garner goodwill, but the terms of these arrangements were often not disclosed. We are now moving into a consolidation phase where the underlying layer of pandemic technology is becoming the new normal, and people have adjusted to what was previously unusual, for instance in the case of possessing and regularly using vaccination certificates. We are also witnessing the increased consumerisation of pandemic technology, for example in the elderly care sector.
In parallel, we are seeing the outcomes of various legal and political challenges to particular incursions by technology actors, and starting to get a sense of what new deployments will become longer-term. Questions that are emerging from our work currently include:
- What part of the problem (regulatory failures and challenges to rights) relates to demand by the public sector, and what relates to increased supply of solutions? Are companies increasing the supply of solutions, are governments, enterprises and citizens increasing their demand, or both?
- How can civil society organisations and regulators respond effectively to the challenges of new sectoral incursions by private-sector technology interests?
- What kind of expansion of focus on the part of CSOs, regulators and others is necessary to resist and shape these changes, given the underlying growth in the infrastructural and political power of technology firms?
The sectoral view
If we take a sectoral perspective on the ways that technology has come into play in novel and potentially problematic ways during the pandemic, we see how this problem has implications on two levels: first, for digital rights organisations working on overarching issues of surveillance, privacy and commercial technologies’ takeover of public-sector functions, and second (and just as important) for organisations working on sectoral problems such as social welfare, workplace surveillance, workers’ rights or access to healthcare. Connecting these perspectives is essential for effective action by regulators and civil society.
From the sectoral perspective we could describe this as a set of ‘supply-side problems’ – frictions and regulatory failures that result from a massive acceleration and increase in solutions marketed by firms, in particular to the public sector. Some examples from our findings (you can see more selected cases in our dataset here):
- Education: new EdTech has been introduced at every level of the educational system, without schools receiving guidance as to privacy and data protection issues for children and students. Firms are using this technological foothold and the funding streams that facilitate it to expand their involvement from facilitating online learning to student surveillance and the introduction of security-related technologies such as facial recognition into schools.
- Social benefit distribution: contactless and remote technologies are being marketed in the sphere of welfare and social benefits that place more interactions online and create new, born-digital application and distribution procedures for needs that first evolved during the pandemic (e.g. compensation for business slowdowns or support for home-schooling connectivity, software and hardware).
- Health: security firms have been contracted to manage quarantine (e.g. G4S in the UK, Take Task in Poland) and testing and tracing (Palantir, also in the UK); logistics and accounting firms are overseeing the distribution of vaccines, PPE and [word missing] (e.g. Deloitte overseeing testing sites in the UK); and large numbers of ID tech providers are involved in setting up and running vaccine certification and checking programmes.
- Policing and security: actors such as biometrics firms, defence contractors and security firms are becoming increasingly involved in internal and external bordering activities in relation to the pandemic – certifying, identifying and controlling spaces at both national and local levels. Examples include RECO3.26 offering facial recognition for control of public spaces in Italy, mobile network providers in the Netherlands conducting crowd monitoring in public space for the public health authority, and HikVision monitoring arrivals with thermal cameras at French airports.
All these sectoral uses tend to have underlying infrastructure in common, for example Amazon’s web services software, Microsoft’s educational platform, Salesforce’s analytics platform, or Google or Apple’s hardware. These, in turn, operate based on cloud services which are run by a core of the wealthiest firms, and whose capacity makes possible the sectoral applications we see taking hold on the global scale.
Beyond the sectoral view: overarching changes
It is important to move beyond thinking of tech companies as exceptional, or as inherently doing good. In practice there is little to distinguish them from other companies whose emphasis is on making profit for shareholders. While tech companies differ among themselves in national origin and ownership model, they are in effect capitalists with a bottom line, and when they move into new sectors they will call this diversifying, not transgressing. The pandemic has offered tech companies an opportunity to explore other avenues for monetising data, and they are increasingly keen not to be tied to advertising, especially as EU regulation of that area through the Digital Markets Act and Digital Services Act has emerged.
In many parts of Europe, under normal conditions, public-private partnerships would be much more complex processes and would face greater scrutiny. During the pandemic, however, solutions that had been awaiting approval for years were suddenly legitimised – for example in Greece, where a long-discussed biometrics law was passed without public consultation under the altered legislative conditions of the pandemic emergency. The emergency situation has produced not just a relaxation of standards but also emergency-related contracts without an end date. As a result, the increasing outsourcing of state functions to the private sector is continuing without sufficient regulatory oversight.
If we take this broader view that technology firms are not exceptional – that they are capitalists, not surveillance capitalists – this diversification sparked by the pandemic creates a much broader power base for them, along with conditions where regulating them effectively becomes more difficult. It also makes sense of the current incursions into spheres such as health, where increasing attention by regulators to advertising technologies and the monetisation of data has created an incentive to gather data in other ways, for instance through direct collection of data in relation to covid risk control, either from patients in hospitals or through the population at large; through processing data on people’s movements and activities in public space, or through gathering information on preferences and behaviour in the education sector through EdTech.
All these sectoral actions facilitate a broader acceptance of technology in the ordinary activities of public and private life, and decrease the sensitivities that lead to civil society action and, in turn, the political will to regulate. They also, in practical terms, open the door to new forms of data collection by the private sector that will have impacts over the longer term on the development of new business models, products, services, and forms of optimisation and automation.
Regulatory and civil society responses
Where regulators do push back, we see different degrees of success. In the French case, the government has been acting to legitimise these partnerships on an emergency basis, using parliamentary processes to create a legal basis for whatever is being challenged. This diminishes the power of regulators vis-à-vis tech firms, because it suggests that if firms establish a foothold with the public sector, legislators will then regularise it despite regulatory concerns.
In the Estonian case, we saw a successful regulatory challenge in 2020 to mobile-based location tracking where the chancellor of justice stated that using GPS data for tracking people is an act of surveillance permitted only on the grounds listed in the penal code, and that given that causing a spread of a virus is not among these grounds, such surveillance is not permitted. The chancellor also observed that mobile tracking is not the best method for checking whether people have actually observed the obligation to restrict their movement since nobody is obliged to have or carry a mobile phone at all times.
Overall, however, these responses are part of what might be described as a ‘whack-a-mole’ regulatory approach to public-private partnerships on surveillance (and other tech-enabled harms), where there is no systematic strategy for where enforcement should be focused, or how to address the problem at the infrastructural or competition level. This means that even in countries with substantial public trust in technology partnerships, such as Estonia, that trust is failing as a result of the acceleration in complexity and volume of tech during the pandemic, and the resulting dependencies on the part of the government. Regulators are not acting to rebuild that trust by taking a coherent approach and articulating their priorities.
According to meeting participants we are seeing a broad regulatory failure around both sectoral incursions by tech firms and the infrastructural power that makes them possible. From Google’s acquisition of Fitbit to Microsoft, Apple, Google and Amazon’s interrelated logistics, cloud and analytics business models, regulators and governments are encountering unprecedented tests and largely failing them. Participants asked whether the size of the companies behind these infrastructures is sustainable in terms of the rule of law, democratic accountability and state capacity.
CSOs across Europe are also identifying cases for strategic litigation. For instance, the Civil Liberties Union is using approaches ranging from freedom of information requests to litigation before domestic and regional forums to ensure that COVID-19 apps continue to respect people’s data rights. Other organisations have engaged in building databases of pandemic technologies, and have also pursued storytelling to ensure that technologies are not accepted without being sufficiently problematised in public spaces.
Addressing emerging challenges
We came out of this meeting with several questions.
Do states wish to control technology firms?
Doing so, particularly in the context of a pandemic and related pressures on the public sector, requires substantial political will. If people are less supportive of private-sector intervention in the form of surveillance and monitoring, but also more used to seeing private firms take on tasks which used to be public, what does this mean for the political will to support regulation of tech firms? It is also possible that states, driven by litigation and by public pushback, do have an increased interest in regulating, but are not keeping up with the pace and volume of problematic uses of tech, or lack the rules to deal with this kind of problem.
What is the most effective framing for action?
Currently civil society and regulators tend to approach individual problems at the sectoral level, but this does not provide enough traction on the whole problem. Participants said that it would be good to see challenges coming from more than a privacy perspective, aiming at societal shifts in governing and in the way solutions are proposed. Privacy challenges should not be abandoned, but we need to move beyond them – the claims brought in the SyRI case in the Netherlands are one good example of this.
Who should be acting to check these transitions and transgressions?
A more holistic perspective would have to bring together organisations working from different angles on the same issue. This has implications for how civil society organisations strategise about the cases they want to tackle, because their logics will differ – for example, bringing an antiracism group together with a privacy rights organisation on a problem will mean that challenges take longer to build, but may ultimately be more effective.
How should action at this overarching level be organised?
One suggestion is to expand the vision from curbing sectoral transgressions to articulating a positive vision of how technology can be used better, given that it increasingly constitutes the infrastructure for work, education and everyday life. This means taking into account the way in which younger people encounter these transitions and transgressions as a fact of life rather than a change, and connecting with organisations formed by and for young people, as well as education organisations, libraries and schools. One participant said in relation to the African continent that ‘the distinction with younger people (aged 12-30) is that tech is perceived as the only solution to large problems. So there is massive faith in tech as a way to solve problems. You can’t dismiss that reality, you have to respect it and come up with different ways to negotiate with actors in the tech sector, and with people who have agency in regard to it.’
Another is for organisations working on the national level to coordinate efforts and bring together sectoral organisations and those working from different perspectives. For this, participants noted that trust between organisations is key and that this kind of collaboration occurs when there are trusted parties to link organisations, for example professors or journalists, who work with different organisations and can help them find new partners. The drawback of this is that it takes significant time to build and coordinate these collaborations.
What do these existing trends tell us about future challenges, and how can we forecast such challenges?
Participants noted that legal institutions are being used to scrutinise what is going on, but that they cannot map these challenges on their own; it would help to have CSOs doing this mapping. Without it, legal bodies can labour under the illusion that they are keeping things in check. Building informational links between regulators, legal institutions and CSOs in different countries across Europe (and possibly beyond) about this kind of challenge is also important, in order to enable local actors to frame their responses in light of larger trends.