Autonomous Vehicles, Software and Product Liability: Have the Law Commissions Missed an Opportunity?


Whether the Consumer Protection Act 1987 (CPA) applies to software is a topic of considerable uncertainty. Since the 1980s, practitioners and scholars have debated whether the European product liability regime (Directive 85/374/EEC), which the CPA implements, applies to products that depend on software, such as automated industrial tools, aeroplanes, and the computers that control nuclear plants or monitor the condition of hospital patients. The application of the CPA to emerging technologies is similarly fraught with legal uncertainty.

There has been much hype about the extent to which the law needs to change to accommodate artificial intelligence (AI), which can make certain decisions and evolve or ‘learn’ independently of humans, but the question of whether the CPA applies to software remains underexplored. Since June 2019, when the English and Scottish Law Commissions published their analysis of responses to the joint preliminary consultation paper in their project on automated vehicles, the chance of this issue being investigated and resolved soon has greatly diminished. Automated vehicles are an ideal case study in how the CPA’s definition of ‘product’ can produce arbitrary outcomes, and sixty-one percent of respondents to the preliminary consultation paper advocated for an investigation of how that definition applies to software. Even so, the Law Commissions declined to include the issue within the scope of their project. In this article, I explore why this question has arisen in the context of automated vehicles, the source of the uncertainty, and why I think the Law Commissions have missed an opportunity by failing to consider it in their ongoing project on automated vehicles.

Why has the question reared its head again?

The question of whether the CPA applies to software has arisen because of the Automated and Electric Vehicles Act 2018 (‘AEVA’). No date has been announced for the commencement of its substantive provisions. Once in force, however, the AEVA will create a regime under which the insurer of an automated vehicle is liable for accidents caused by that vehicle while it is ‘driving itself’ on a road or other public place. Insurers may recover, in turn, from any other person liable to the injured party in respect of an accident. As the Explanatory Notes to the AEVA explain, insurers may recover ‘under existing common and product law’ (para 12).

Claiming in product law against the producer of an automated vehicle, or of a component of one, is likely to be the most obvious option. A CPA claim has significant advantages over the common law. For a start, the CPA imposes a regime of strict liability. This overcomes considerable difficulties that AI-driven technologies present for negligence law by relieving the claimant of the need to prove fault, which would be difficult because it is highly uncertain what circumstances will be ‘foreseeable’ to a software producer and what conduct on its part will be ‘reasonable’. The CPA also overcomes the doctrine of privity, which prevents, for example, pedestrians from claiming damages for breach of contract against the producer of an automated vehicle.

Against this background, the Law Commissions of England and Scotland released a joint preliminary consultation paper in November 2018. This paper marked the start of the Law Commissions’ project on automated vehicles and aimed to help focus their inquiry. For our purposes, the most interesting question the Law Commissions posed was: ‘Is there a need to review the way in which product liability under the Consumer Protection Act 1987 applies to defective software installed into automated vehicles?’

Sixty-one percent (53 of 87) of respondents to the Law Commissions’ paper answered that there was a need to undertake this review. But the Law Commissions concluded that they would not be doing so, primarily because any review of the application of the CPA to software should be undertaken generally, not simply for automated vehicles.

The Law Commissions nevertheless expressed ‘hope’ that the Government would consider such a review following the report of the European Commission’s Expert Group on Liability and New Technologies. That Expert Group was due to report mid-2019 but has not met since May and has not yet published its report.

The source of uncertainty

To take a step back: the CPA applies to ‘products’. It defines a ‘product’ as ‘any goods or electricity’, including products ‘comprised in another product’, such as component parts. It is generally accepted that software is not ‘electricity’, even if the result of software operating is the presence or absence of electricity. Whether software is a ‘good’ under the CPA is, by contrast, still open to debate. The question has not been tested in the courts of England and Wales, nor in those of Scotland (nor Northern Ireland, although the provisions applicable there are a clone of the CPA rather than the CPA itself).

The courts have considered whether software is a ‘good’ in other areas of law, including sale of goods provisions and for the purpose of a common law possessory lien. In a line of authority emanating from the obiter dicta of Sir Iain Glidewell in St Albans City and DC v International Computers Ltd [1996] 4 All ER 481 (CA), the courts have held that software is not a ‘good’ because it is intangible, whereas a ‘good’ denotes a tangible ‘thing’. Correspondingly, the courts have concluded that where software is supplied on hardware, the software and hardware combination is a ‘good’.

In the most recent case in this line, the Supreme Court has referred the question of whether software is a ‘good’ under the Commercial Agents (Council Directive) Regulations 1993 to the European Court of Justice, in an appeal from the Court of Appeal’s decision in Computer Associates (UK) Ltd v The Software Incubator Ltd [2018] EWCA Civ 518. In Computer Associates, the Court of Appeal concluded that software was not a ‘good’, referring to the St Albans line of authority. This case has, so far, only further reduced the likelihood of the courts finding that software is a ‘product’ under the CPA.

A missed opportunity 

The Law Commissions’ decision not to include a review of this aspect of the CPA in their current project on automated vehicles is no doubt disappointing to the sixty-one percent of respondents to the joint preliminary consultation paper who advocated for it, many of whom were insurers and legal advisors.

Whether software is a ‘good’ is of great significance. After all, software developers who supply software together with hardware are producers of a ‘product’ and are exposed to liability under the CPA if a defective product they supply causes damage. Those who supply software alone are not so exposed. Whether software is a ‘good’ is, therefore, a vital threshold question for producers of software, producers of products in which software is embedded, those using such products and those living alongside them.

Automated vehicles illustrate the significance of this question. The automated driving system (ADS) that enables a vehicle’s self-driving features is made up of hardware and software. Yet, curiously, the way in which these systems are supplied and integrated into the vehicle determines whether the software component is a ‘product’ under the CPA. If the software is supplied by upload or download, for example, it is not a ‘product’; if it is supplied on a hard disk, it is.

This difference seems unprincipled: the same software ends up in the vehicle regardless of the format in which it is transferred from the developer to others. It is also clearly unsatisfactory for claimants. The same injury caused by the same software defect will, in one instance, be compensable under the CPA and, in the other, not, merely because of the way the software was supplied.

The uncertainty is amplified when updates or enhancements to existing hardware or software are pushed over the air to an automated vehicle. Most software updates now take place in this way, yet they probably fall outside the scope of the CPA because no ‘product’ is being supplied.

This conundrum is unsurprising given the age of the CPA and the Directive. However, other jurisdictions have moved more quickly than the UK without appearing to halt innovation or impose too great a burden on the software industry. In the US, the courts have held that mass-produced software, that is, ‘off-the-shelf’ software that is not custom-made, can be a ‘good’ for the purposes of the product liability provisions of the Restatement (Third) of Torts, reasoning that the policy underpinning the imposition of strict liability on manufacturers applies equally to producers of off-the-shelf software. Similarly, the Australian legislature has amended the definition of ‘goods’ for the purposes of Australian product liability laws (which are based on the Directive) expressly to include ‘computer software’. The UK’s laws are, in this respect, not only failing to keep abreast of technological change but also trailing legal developments in other jurisdictions.

Where now?

Whether software is expressly brought within the definition of a ‘good’ under the CPA or a sui generis category is created for it, akin to ‘digital content’ under the Consumer Rights Act 2015 (see here), the Law Commissions have missed an obvious and critical opportunity to contribute to the discussion.

Examining how software is treated under the UK’s product liability regime in the context of automated vehicles would have provided an excellent case study through which to explore issues that arise for the myriad technologies in which software is embedded. Such an examination would also have contributed to the valuable and urgent work called for by the House of Lords Select Committee on Artificial Intelligence. It must be remembered that the Law Commissions’ project on automated vehicles takes place against the backdrop of the Select Committee’s call for work to be done ‘as soon as possible’ to clarify ‘whether new mechanisms for legal liability and redress’ are needed to cope with situations where systems utilising AI malfunction.