Smart law: interview with Sarah Green

As the nation embraces life online as never before, Oliver Sanders QC talks to Professor Green about her journey from coder to Commissioner for Commercial and Common Law, her ambitious programme of reform, and the implications of emerging tech for both law and society

I meet Professor Sarah Green at the Law Commission offices in Westminster in early March 2020, not long after she formally took up her new role as Commissioner for Commercial and Common Law (‘CoCo’). The remaining three Commissioners are responsible for criminal law (Professor Penney Lewis), property, family and trust law (Professor Nicholas Hopkins) and public law and the law in Wales (Nicholas Paines QC).

Prof Green’s primary focus will be on her own area of specialist expertise – the law relating to the digital economy and emerging technologies. She describes her key mission as ‘updating and modernising the law to make it more accessible, effective and relevant for the digital age’.

The importance of this was powerfully underlined soon after our meeting when the COVID-19 lockdown was introduced, forcing even more of our everyday communications and transactions onto online platforms.

Prof Green read law and then industrial relations at Oxford University in the late 1990s (Balliol College and Saïd Business School), before working in the IT industry during the ‘dotcom boom’ of the early 2000s (at Accenture Plc) and then moving into academia (Birmingham, Oxford and, most recently, Bristol Universities).

Although she now describes herself as ‘a dyed-in-the-wool academic’, Prof Green’s early career as a software consultant (‘essentially a coder’) clearly had a formative impact on her legal interests. After completing her Master’s, she proceeded on what sounded like an enormously challenging journey from an ‘exciting and glamorous’ pitch by Accenture at a graduate careers fair, to a ‘very intense’ three-week bootcamp at ‘Code School’ in Chicago and on to the construction of websites for the Sony PlayStation 2.

While Prof Green says her knowledge of coding is now (quite understandably) long out of date, her experience in the industry has nevertheless left her with a solid grounding in what she calls the ‘logical architecture’ of coding and an insider’s understanding of its immense practical potential and possible legal implications. In this regard, she sets great store by the Bill Gates adage, ‘We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten. Don’t let yourself be lulled into inaction’.

Her fascination with the regulation of this area begins at ‘the intersection between property, contract and tort’ and has a particular focus on the treatment of ‘smart contracts’ (‘automated computer programs or algorithms that can execute legally enforceable obligations without human intervention’), ‘digitised assets’ (such as cryptocurrencies and software) and artificial intelligence or ‘AI’.
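For readers unfamiliar with the idea, the self-executing character of a smart contract can be caricatured in a few lines of code. The sketch below is purely illustrative – the names and logic are invented for this article and it does not represent any real smart-contract platform – but it captures the point that, once the triggering condition is met, the obligation performs itself with no human decision point:

```python
from dataclasses import dataclass

@dataclass
class Escrow:
    """Toy 'smart contract': holds a buyer's payment and releases it
    to the seller automatically once delivery is confirmed.
    Illustrative only -- not based on any real platform."""
    price: int              # amount held in escrow (arbitrary units)
    delivered: bool = False
    released: bool = False

    def confirm_delivery(self) -> None:
        # An external event (e.g. a courier's confirmation) flips the flag...
        self.delivered = True
        self._execute()

    def _execute(self) -> None:
        # ...and the payment obligation then executes itself:
        # no party decides whether to perform.
        if self.delivered and not self.released:
            self.released = True

contract = Escrow(price=100)
contract.confirm_delivery()
print(contract.released)  # prints True: the obligation has executed
```

It is precisely this automaticity – performance without a human in the loop – that generates the questions of formation, knowledge and mistake discussed below.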

These interests are well aligned with the CoCo strand of the Law Commission’s 13th Programme of Law Reform (launched December 2017), and this no doubt informed its decision to include Prof Green in a new outreach initiative whereby Commissioner vacancies are drawn to the attention of possible candidates who might not otherwise consider applying.

As Commissioner for CoCo, Prof Green has assumed responsibility for her team’s ongoing work on the reform of the ‘right to manage’ in leasehold, the protection of consumer pre-payments in the event of retailer insolvency and intermediated securities. She is also keen to start on work more squarely focused on smart contracts and digitised assets and has various other tech-related topics on her ‘wish list’ as issues that the Law Commission might be asked to look at during her tenure.

By way of an example on the latter front, Prof Green raises the question of product liability in connection with malfunctioning computer software and the use of AI in the context of recruitment and disclosure exercises. She cites recent studies by the Oxford Internet Institute which have looked at the use of historical data by recruitment selection algorithms and found that such data can be contaminated by, and so entrench, historical bias and discrimination. Establishing responsibility for such matters will obviously have important implications for data collectors, software developers, employers, job applicants and society as a whole.

A large part of our discussion centres on the recent and ground-breaking decision of the Supreme Court of Singapore (Court of Appeal) in Quoine Pte Ltd v B2C2 Ltd [2020] SGCA(I) 02. In this case, the judgment of Chief Justice Sundaresh Menon pithily begins:

‘The world of cryptocurrency trading is not for the faint-hearted. It can involve computer-generated, high frequency transactions in digital quasi-currencies (otherwise known and referred to in this judgment as “cryptocurrencies”) which manifest on computer screens or printouts but are not otherwise in a physical form…

Sometimes things can go wrong, as they did in the present case.’

Handed down the week before our meeting, the Quoine decision acts as a timely reminder of how unsettled the law is in this area and an illustration of the issues that can arise. Prof Green elaborates on a number of very interesting questions around: the formation of smart contracts; the relevance of principles of agency and the actual and constructive knowledge of programmers; the enforceability of built-in automated dispute resolution processes; the application of the doctrine of mistake; remedies and the scope for rectification or novation; and the private international law implications of classification.

One legacy of Prof Green’s time in industry may well be reflected in the stress she lays on the ‘absolutely essential’ role of stakeholder engagement and input in the work of the Law Commission:

‘I am very aware that, coming into this job as an academic, I want to avoid any sort of ivory tower myopia… What I really enjoy about speaking to stakeholders – although it can sometimes be frustrating if it’s not what I want them to say – is that it gives me and the team a completely different perspective. And I think what’s good about it is that it forces you to have the humility to say “this is what I think is important, but I really need to know if it isn’t”.’

She also emphasises the ‘invaluable’ contribution of the Commission’s in-house, seconded Parliamentary Counsel (at all stages, not just in producing draft legislation) and the ‘peer review’ function of all the Commissioners sitting together with the Commission Chair (currently Sir Nicholas Green).

In addition to the above, Prof Green has initiated a Commission-wide ‘AI and emerging tech incubator’ which she hopes to integrate into its processes as a standard stage for every project. The intention is that the incubator will operate as both a resource and a forum for facilitating the consideration of possible tech developments, which may present both problems and solutions. In this regard, Prof Green stresses that AI and emerging tech have implications across every area of the Commission’s wider work. For example, the Criminal Law and Public Law teams have recently been considering cyber-crime and automated vehicles respectively and, in the CoCo context, the likely impact of quantum computing on the feasibility of ‘mining’ cryptocurrencies is already firmly on Prof Green’s radar.

Prof Green speaks very highly of the work done by her predecessor, Stephen Lewis, and the induction and handover process which has made for a ‘really positive transition’ from academia. She is technically on unpaid leave from Bristol University and so retains her Chair there, although she reflects that she is at least able to enjoy a welcome break from marking papers for the time being. She takes on a truly fascinating subject at a critical time and the law can only benefit from her evident insight, energy and enthusiasm.

Oliver Sanders QC

Oliver Sanders QC practises from 1 Crown Office Row, London and specialises in: public law and human rights; data protection and information rights; inquests and inquiries; and security, intelligence and extremism cases.