Prof. Schafer, your job description reads “Professor of Computational Legal Theory” – asking naively: do computers need a different legal theory? Asking less naively: what is it exactly that you do?
An excellent question, and one day I may be able to give you an answer (laughs)…
When I decided on the name for my chair, I wanted to combine the two aspects of my work that are most important to me: the use of computer technology within the justice system (legal informatics, Rechtsinformatik), and the regulation of technology through the law (IT law, Informationsrecht). I do not see these as distinct; rather, they overlap substantially, and it is this overlap that interests me most. Both require jurisprudential reflection, and legal theory is for me also what ties them together. Jurisprudence asks two age-old questions. The first concerns the nature of law and legal knowledge: what do we really know when we “know the law”? The second is the normative question of what we can expect from a good justice system: what, normatively, is a just society, and what does it mean to live life lawfully? Both the descriptive and the evaluative lens are needed when building legal technology. Every legal technology application amplifies some aspects of justice and silences others. Our theory of the nature of law, and our ideal of a just society, are therefore also the benchmarks against which we should evaluate any (legal) technology. Without knowing what we mean by “law”, we cannot judge whether a given legal technology gives the “right” answer, and without a vision of a just society, we cannot evaluate whether its use is more beneficial than harmful.
Lawrence Lessig famously wrote in the late 1990s that “code is law”, you use the wording “code as law”. What is the difference between these approaches?
My concern with “ontologising” the connection between code and law the way Lessig’s formulation (probably unintentionally) does is twofold. First, it makes it look as if the use of code to constrain online behaviour is an inevitability, almost like a law of nature: one day we “discovered” it, like gravity. Second, it implies that there is no “loss” when we move from traditional legal rules to regulation through software architecture – they are, after all, identical. By contrast, “code as law” emphasises for me that there are always choices involved. Code, in Searle’s terminology, “counts as” law only if we decide that it should do so. This may be a good choice in some contexts and for some purposes, and a very bad one indeed in other contexts and for other purposes. Nor are law and code identical; rather, their relation is more akin to that of a translation. We can always, and trivially, translate law into code, but with every translation some meaning of the original gets lost, and the translators add nuances of their own. Analysing exactly what gets lost, and making informed decisions about when and under what conditions we can treat code as if it were law despite all the differences, is for me one of the tasks of a computational legal theory.
Which research question are you tackling while you are with us? How do you like the working environment here at REWI Uni Graz?
Let’s start with the second part of the question: the working environment in Graz is perfect for anyone who wants to tackle the societal challenges of cutting-edge technologies from a multidisciplinary perspective that systematically foregrounds the human and the human experience. It is difficult to overstate how important this type of research is at this point in time. If we look, for instance, at the high-level “AI Summit” in the UK, we hear a lot of talk about “ethical” or “trustworthy” AI. It is in equal parts disheartening and worrying to see how the problem of “trustworthiness” and “AI for the common good” is reduced to narrow engineering questions, limiting our thinking about the negative impacts of these technologies to issues that are in principle solvable through “slightly less bad” computing techniques. Between one-dimensional techno-solutionism and equally misguided fearmongering about a future “singularity”, it is more important than ever that the humanities provide voices that are technologically informed, rigorous and nuanced in their analysis.
Graz has a track record of generating this type of excellent, interdisciplinary research with impact both nationally and internationally, for instance in the Human Factor in Digital Transformation network and now also the IDea_Lab. Their structure and mission are in many ways similar to those we have been building in Edinburgh, in my own SCRIPT Centre and now also in the Edinburgh Futures Institute. Despite these structural similarities, the mix of subjects and perspectives in Graz is truly unique, which opens up many avenues for synergies and mutual learning. Its geography is also unique: even though we may think that physical geographies have become less relevant in the digital age, this, I would argue, is far from true. Graz is located at the historical intersection of countries, cultures and communities, benefitting intellectually from this diversity of experiences and voices, but also ideally placed to play a positive role for the region and beyond.
Regarding my own research, my stay in Graz is generously supported by an Austrian Standards Fellowship, so unsurprisingly, my work touches in several ways on the issue of technical standards in AI regulation. Legally mandated technical standards have become an important part of the regulatory toolkit, most recently in the proposed EU AI Act. Despite their central role in technology regulation, they are also a bit of a blind spot in legal education and legal research. We do not normally teach students how to read, interpret and reason with technical standards. And there is almost no jurisprudential or legal theoretical reflection on their nature and use in the justice system. Graz has been at the forefront of addressing this gap, and I want to look into two fields in particular: automated legal translation and digital ownership technologies.
Regarding the former, I have been interested for quite some time in the question of legal translation, also through my work with the Scottish Law Society and recently also as a member of an EU COST Action on “Language in the Human Machine Era” (LITHME). Automated legal translation will have a more profound impact on the justice system than many other, more headline-grabbing legal technologies. Its impact will be felt in particular in countries such as Scotland and Austria that have more than one official language, and that also have highly diverse immigrant communities that often face additional hurdles regarding their access to justice. There is an interesting Scottish-Austrian connection here, too, through Eleanor of Scotland (for the full details though, you will have to come to my Lunch Lecture, as not all of it is suitable for younger readers…). There has been longstanding and highly successful collaboration between Edinburgh’s SCRIPT Centre and the Fachbereich Recht und IT in Graz on a set of European standards for the ethical use of legal technologies, facilitated by AI4People, and this part of the project will build on and deepen that work. There are few standards for machine translation, and none from the European standardisation organisations. To ensure that this technology achieves its potential in widening access to justice while protecting the rule of law, it is crucial that not only computer scientists, but also experts from the humanities and law are involved in developing them.
Regarding the latter, the Law Commission for England and Wales published its white paper on the reform of the law of ownership of digital assets this year. At the time, I authored a response to the consultation on behalf of our DeCaDe centre, focussing on the proposed treatment of NFTs. In its proposal, the Commission essentially takes a private standard, Ethereum’s ERC-721, and imbues it with additional legal significance. Should the proposal go ahead, Scotland will have to decide whether to follow the approach taken in England, or whether our very different property law requires another solution. Scotland has a mixed legal system, and Scots property law in particular has been influenced more by Roman law and the continental legal tradition than by the common law of England. My stay in Graz will triangulate research collaboration I have started with the University of Grenoble Alpes, and together with them and colleagues in Graz I want to explore how civilian jurisdictions such as France, Austria and Germany are responding to the challenges of blockchain technology and NFTs, with a view to also informing the discussion in Scotland.
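For readers unfamiliar with the standard mentioned above: at its core, ERC-721 treats “ownership” of a non-fungible token as nothing more than an entry in an on-chain registry mapping a token ID to an address, queried and changed through functions such as ownerOf and transferFrom. The following is only an illustrative Python sketch of that idea (the actual standard is a Solidity interface; the class and method names here are this sketch’s own), intended to make concrete what kind of technical artefact a legislator would be imbuing with legal significance:

```python
class NFTRegistry:
    """Toy model of an ERC-721-style ownership ledger (illustrative only)."""

    def __init__(self):
        # The entire "ownership" record: token ID -> owner address.
        self._owner_of = {}

    def mint(self, token_id: int, owner: str) -> None:
        """Create a new token and record its first owner."""
        if token_id in self._owner_of:
            raise ValueError("token already exists")
        self._owner_of[token_id] = owner

    def owner_of(self, token_id: int) -> str:
        """Analogue of ERC-721's ownerOf: the registry entry *is* the ownership fact."""
        return self._owner_of[token_id]

    def transfer_from(self, sender: str, recipient: str, token_id: int) -> None:
        """Analogue of transferFrom: only the recorded owner may reassign the entry."""
        if self._owner_of.get(token_id) != sender:
            raise PermissionError("sender does not own this token")
        self._owner_of[token_id] = recipient


# Example: "ownership" changes by rewriting one dictionary entry.
registry = NFTRegistry()
registry.mint(1, "alice")
registry.transfer_from("alice", "bob", 1)
print(registry.owner_of(1))  # prints "bob"
```

Whether such a registry entry should count as ownership in the legal sense, and what happens when the on-chain record and off-chain legal reality diverge, is precisely the kind of question the law reform debate turns on.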
What would be your advice to a young researcher starting in the field of AI regulation?
Avoid all the mistakes I have made, of course! More seriously, this is a difficult question for me to answer, not least because the career structure and employment landscape are so different in the UK and continental Europe – though this might be my first piece of advice: travel, if you can, and spend at least some time in a foreign academic environment. In my experience, nothing helps more to perform a “cognitive Gestalt-switch” and to revisit old certainties from new perspectives, a crucial element of research. Consider also the world off the beaten track – the voices of the US and the larger EU member states are already heard loud and clear; the greatest added value may come from working in those countries, traditions and groups whose voices are often drowned out by them. As I said earlier, geographies matter, and Graz is uniquely placed, and uniquely connected, to facilitate such an experience.
My second piece of advice would be: don’t tie yourself too closely to one specific technology. I’m a veteran of at least one AI winter – while there was massive enthusiasm for “AI” when I was a student, by the time I entered the labour market the term had become so toxic that not only did funding and jobs dry up, but entire research groups dropped the label. I am particularly concerned when I see positions for early career researchers advertised as, e.g., “the law of smart contracts”, “blockchain and the law” or “the law of large language models”. By the time the project is finished, the field may well have moved on. Technologies change, and change rapidly. But humans change at a much slower pace, which also means that the problems technologies create are more often than not new variants of age-old concerns, as old as humankind itself. Law represents a collective memory of conflict resolution strategies that we neglect at our peril. One of the unique features of Graz is that it groups the chair for IT law with the foundational subjects, including legal history and legal philosophy. This gives a breadth of perspective that is difficult to achieve when technology is seen exclusively through the lens of commercial, data protection or criminal law. Analysing digital transformations in the context of the foundational legal subjects is the best guarantee for technology law research that is future-proof and passes the test of time. What is needed, then, is a difficult balancing act: develop solid and detailed competence in your legal specialism of choice, which will act as a second leg to stand on should “technology law” as a subject shrink again at some point, but also cultivate the ability to step back from these minutiae and reassess the legal responses to technology from the perspective of the big historical and philosophical narratives.
The third would be to cultivate skills that are also central to the legal profession, in particular the ability to listen attentively, sympathetically, respectfully and yet critically to what others are saying. Just as we elicit data from a distraught client who does not understand what information the law requires, critically analyse the argument opposing counsel actually made (and not the one we wished they had made) for weaknesses, or try to discern the subtext of a judge’s ruling, we have to learn to listen attentively to what engineers and scientists have to say when we engage in interdisciplinary research with them. And we also have to listen with respect and attentiveness to the people and groups affected by these technologies – their lived experience is likely to be very different from yours. Good advocacy, in legal advice and in research, always remains sensitive to this.
The fourth, and most important one: have fun and be bold! Develop a sense of playfulness when dealing with technologies, experiment with them and find out which ones speak to you the most. Then think about the future you would like to see, for yourself and for the generations to come, and ask how the law can help us create the utopias that allow human flourishing.