Post 6 in the Guest Blog Series ‘Preserving Public Values in the Automated State’ by Natalia Menéndez González and Spyros Syrrakos
Proportionality dominates modern democratic constitutionalism. We may live in an analogue ‘age of proportionality’; however, the unprecedented digital transformation upends its foundations, rendering the boundaries between ‘private’ and ‘public’ porous. On the one hand, private platforms assume a quasi-regulatory role by balancing competing values in a pluralist digital space; on the other, governments increasingly rely on private technologies to facilitate access to public services, as seen in education, justice and migration. This post unpacks proportionality as a public value and legal construct, and reflects upon the complexities of the digital age, particularly within the contexts of education, migration and justice explored in the project DigiPublicValues.
Unpacking Proportionality
Defining proportionality may seem an elusive task, owing to its multifaceted nature. From an administrative law perspective, proportionality aims to shield individuals from excessive public power while promoting a ‘culture of justification’, thus rendering decision-makers accountable on the basis of rational, substantive reasons. Proportionality is ‘at the cornerstone’ of EU law, constituting a general principle which sets clear boundaries for the division of competences between the EU and the Member States under Article 5(4) TEU. The prominence of proportionality shines through in the human rights field. It constitutes the dominant judicial technique for the adjudication of human rights conflicts, and a ‘best-practice standard’, typically operationalised via a three-pronged test: a measure must be suitable to attain a legitimate goal, necessary, and not excessively burdensome to the individual (proportionality stricto sensu). This test is also applied by the CJEU under Article 52(1) of the EU Charter, while under the European Convention on Human Rights almost any right may be restricted by law where this is ‘necessary in a democratic society’ in pursuit of a legitimate aim.
Delegation of state functions to private technology providers
Proportionality, then, as a public value and legal principle, promotes fundamental rights protection, accountability and justice. Yet in the digital realm these ideals are challenged by the privatisation of public services, as public authorities come to rely on digital technologies developed or deployed by corporate actors. Crucially, the delegation of state functions to private entities entails a concrete risk of datafication of areas of general public interest, which may undermine their ‘publicness’. Within education, for instance, the introduction of new technologies on the basis of suitability and necessity criteria has sparked debate on the surveillance of children and on the legality of processing sensitive biometric data, be it for remote learning, personalisation, or identification and authentication purposes. The Swedish Data Protection Authority’s condemnatory decision on the trial use of facial recognition at Anderstorp Secondary School highlights the disproportionate impact of such technologies. Equally, the TOEIC scandal in the migration field illustrates the risks of blind trust in private actors’ proportionality assessments: the Home Office failed to directly examine the suitability of the US test provider’s voice recognition technology, or the validity of its assessments of cheating, leading to the wrongful removal of thousands of people from the UK.
Economic considerations as trumps?
In a similar vein, the notion of effectiveness (often arising in relation to necessity) is highly contested in the realm of digital technologies. The current discourse portraying certain technologies (e.g. facial recognition) as suitable and necessary is typically shaped by the security industry, on the basis of purported empirical evidence of the technological advantage these tools offer over analogue ones. Such practices, however, are crippled by inadequate accountability and by a lack of transparency regarding both the decision-making process and the justification of the final outcome. This raises a series of concerns. Firstly, the prioritisation of monetary efficiency in the balancing exercise significantly expands the concept of ‘legitimate goal’, giving the impression that efficiency should be viewed as a public value in itself. This is often reinforced by the applicable legislative framework: the UK judicial digitisation strategy, for instance, depends heavily on public procurement for cooperation with technology providers, as evidenced by the Money Claim Online tool. Secondly, one may, more broadly, wonder whether proportionality has been manipulated to legitimise the pernicious effects of invasive technologies through the introduction of safeguards, even though the very essence of the rights to data protection and privacy could well be infringed in several cases.
Collective harms
Finally, the privatisation of public functions in the digital space creates distinct challenges for the contextual nature of the proportionality test, and for its durability over time, which demands regular monitoring of local circumstances. Digital technologies are deployed in highly context-specific circumstances which change rapidly, owing to their disruptive nature. Within the migration field, for instance, the use of diverse technologies (e.g. blockchain, AI) raises distinct legal challenges, each of which must be individually considered in a proportionality assessment. This prompts several questions. Firstly, how should this contextual analysis be conducted, and how should the intangible nature of the harm be assessed? Traditionally, proportionality focuses on the particular circumstances surrounding the adoption of the measure, and, as noted above, private actors tend to prioritise economic considerations. This individualistic and utilitarian focus seems ill-suited to the digital era, since the scalable effects of AI may produce societal harms with a disproportionate impact on vulnerable groups, including immigrants and refugees. In the UK, the Court of Appeal was criticised in Bridges precisely because it disregarded the collective harm stemming from live facial recognition technology in the policing context. Secondly, how can one remain in the loop when technology is constantly changing? Post-deployment monitoring seems necessary; however, the business model of ‘data maximisation’ sits uneasily with the data minimisation principle, a clear expression of proportionality under data protection law.
The shift from the world of atoms to the world of bits leaves us with a number of puzzles, not least because of the tremendous power private actors have assumed in the digital society. Against this backdrop, the primordial function of proportionality as a bulwark against the abuse of power can act as a beacon of light for algorithmic due process, binding both public bodies and private actors.
Bios
Natalia Menéndez González is a PhD candidate at the European University Institute, where she researches the application of the proportionality principle to the use of Facial Recognition Technology. She is also a Research Associate on Data Governance at the Centre for a Digital Society.
Spyros Syrrakos is a doctoral researcher at the LSE Law School. His research explores the interplay between proportionality and digital rights in the EU via an empirical legal approach.