Friday 26 January 2024

If you like this ...

I was hoping that someone else would by now have critically analysed the recent High Court judgment in Emotional Perception AI Ltd [2023] EWHC 2948 (Ch), which issued on 21 November 2023, but it appears that nobody has yet. There has been a recent article in the CIPA Journal, but the less said about that the better. Even the IPKat has said nothing about it so far, which is a pity. It therefore falls to me to do the necessary explanation and (to give the game away somewhat) point out that the judgment is a definite outlier: it is not in line with higher level case law, including several judgments from the Court of Appeal, nor with case law at the EPO. Several articles have been published that, rather excitedly, announce the judgment as some kind of breakthrough for AI inventions because it finds, in effect, that a trained artificial neural network (ANN) is not a program for a computer under section 1(2) of the Patents Act 1977 (which is, of course, in reality complete nonsense). As a consequence, the reasoning goes, AI inventions will now be much more patentable than they were before. The UK IPO have even changed their practice to instruct their examiners that, as from 29 November 2023, objections should not even be raised to inventions involving an ANN for excluded subject matter. This is great news for inventors working in the field of AI who want patent protection (which may not be all of them), and also great news for their patent attorneys. I, however, am not so sure it will work out well in the longer run, and suspect this will prove a temporary aberration, although I may of course be wrong.

I have been following case law on patentability in the UK and at the EPO for the past 17 years or so, coincidentally starting roughly around the time of the Court of Appeal judgment in Aerotel/Macrossan [2006] EWCA Civ 1371, which issued while I was sitting my UK Finals in 2006. Looking at my IPKat posts from around that time (see here for example), there was much discussion in the follow-up to Aerotel about whether the 4-step test was actually in line with the EPO, which instead applied the problem-solution approach according to Comvik (T 641/00). After a bit of disagreement about the validity of computer program claims (see here), the issues in the UK appeared to settle. Although the tests applied at the UK IPO and the EPO are very different, the core issue is essentially the same, namely whether there is a 'technical effect' (which I wrote about in 2013 here). I have revisited this several times in the intervening years, most recently when writing a chapter for the IPKat's 20th anniversary book. Although there has been some tinkering around the edges, the general principles have not really changed for at least 10 years and, for better or worse, have been applied with reasonable consistency by the UK IPO. These principles can be found in any of the numerous decisions that have come from UK IPO hearing officers, which I have had the dubious pleasure of reviewing for the CIPA Journal for the past 17 years. You may even have read one or two of my reviews, although I suspect very few people do.

Firstly, according to Aerotel/Macrossan, the way to deal with 'excluded matter' under section 1(2) (i.e. things that are not inventions for the purposes of the Act) is to: 

i) properly construe the claim; 

ii) identify the contribution (which may be the actual or alleged contribution, depending on whether a search has been performed); 

iii) ask whether the contribution falls solely within excluded matter; and 

iv) if step iii) hasn't already covered it, check whether the contribution is actually technical. 

Secondly, in considering whether a computer program makes a technical contribution, the later decision in AT&T Knowledge Ventures/CVON Innovations [2009] EWHC 343 (Pat), which was supported and slightly amended by the Court of Appeal judgment in HTC v Apple [2013] EWCA Civ 451, set out the following five 'signposts', any one of which may indicate the presence of a technical contribution according to step iv) of the Aerotel test:

i. Whether the claimed technical effect has a technical effect on a process which is carried on outside the computer. 

ii. Whether the claimed technical effect operates at the level of the architecture of the computer; that is to say whether the effect is produced irrespective of the data being processed or the applications being run. 

iii. Whether the claimed technical effect results in the computer being made to operate in a new way. 

iv. Whether the program makes the computer a better computer in the sense of running more efficiently and effectively as a computer. 

v. Whether the perceived problem is overcome by the claimed invention as opposed to merely being circumvented.

It is worth noting at this point that, although Kitchin LJ stated, "these are useful signposts [...] But that does not mean to say they will be determinative in every case", in practice at the UK IPO, rightly or wrongly, these signposts have been determinative in every case where they have been applied. There has not been a single case where the signposts have been used and yet the invention has still been found to have a technical contribution. I would certainly have noticed and pointed it out with great excitement if there had been. To at least a first approximation therefore, if the contribution in a computer-implemented invention is not found to be within at least one of the signposts, it is not patentable.

With that all established, a computer-implemented method in a patent application such as the one put forward by Emotional Perception AI Limited (previously known as Mashtraxx Limited) would be expected to face a difficult time in getting granted. The application itself related to a method of training and implementing an ANN to identify a pair of similar data files, a particular example being music files. In simple terms, the claimed invention was about matching data files based on closeness of written descriptions and of the data itself. The actual claimed invention is a classic example of technical obfuscation, with the latest version of claim 1 reading as follows:

The IPO examiner objected that the claimed invention was not patentable because it related to a mathematical method and a computer program as such, finding that the contribution made by the claimed invention (leaving aside clarity and sufficiency issues with claim 1) was software for training an ANN to identify similar files by extracting measurable signal qualities from two files, assembling a multidimensional feature vector for each file based on these qualities, determining the distance between the two vectors and adjusting the ANN's weights to provide a measure of similarity between the files. In effect, the invention (in my words, not the examiner's) was about analysing music files to provide an output of the type "if you like this, then you might also like this". The examiner found that this did not solve a technical problem, and especially not a technical problem within a computer. The invention was computer-implemented, but used conventional hardware programmed to perform a non-technical function. It was not directed to a process outside of the computer, nor did it form part of the internal workings of the computer. In conclusion, the examiner found that the invention was "directed to an excluded process and there is nothing more to it" (examination report 13/10/21, page 10/11). 
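To see what the contribution the examiner described actually amounts to, it can be sketched in a few lines of code. This is a deliberately simplified illustration in Python: the choice of 'signal qualities' and all the names are mine, not the application's, and the real claimed system of course used a trained ANN rather than fixed formulas:

```python
import math

def feature_vector(signal):
    """Hypothetical 'measurable signal qualities' of a sampled
    audio signal: mean level, spread, and peak amplitude."""
    n = len(signal)
    mean = sum(signal) / n
    spread = math.sqrt(sum((s - mean) ** 2 for s in signal) / n)
    peak = max(abs(s) for s in signal)
    return [mean, spread, peak]

def distance(v1, v2):
    """Euclidean distance between two feature vectors: smaller
    distance means the two files are judged more 'similar'."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))

# Two toy 'music files' represented as raw sample values
file_a = [0.1, 0.4, -0.2, 0.3]
file_b = [0.1, 0.5, -0.1, 0.3]
print(distance(feature_vector(file_a), feature_vector(file_b)))
```

The point is that, however the feature vectors and the distance measure are arrived at, the output is simply a number indicating how similar two files are; nothing about the files themselves, or about the workings of the computer, is changed.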

After reaching this impasse with the examiner, the applicant then went for a hearing before the (very experienced) hearing officer Phil Thorpe. Among the arguments run by the applicant, one involved a comparison with the seminal EPO Board of Appeal decision from 1986 of Vicom (T 208/84), which decided that an image processing algorithm could be patentable because the output was an improved image, which was considered to be a technical result. The hearing officer did not, however, consider that the comparison was a good one, because in this case there was no change to the data files, but only an output that provided an indication of a pair of semantically similar files. The claimed invention, in the hearing officer's view, did not therefore define a technical process, and the contribution was found to relate wholly to a computer program as such, resulting in the application being refused. 

At that point, I would expect any normal applicant to give up and take the loss, given the very low hit rate of getting computer-implemented inventions facing excluded matter objections granted at the UK IPO. In this case, however, the applicant and their attorney did not give up and instead took the matter to appeal before Sir Anthony Mann in the High Court. Although obviously a highly experienced legal mind, Sir Anthony has not come to my attention at all previously in the area of patent cases, let alone the highly specialised area of patentability of computer-implemented inventions. As it turns out, this is I think one of the key problems with the judgment. 

Sir Anthony first set out how he understood the invention, which he considered could be envisaged as a 'black box' "which is capable of being trained as to how to process an input, learning by that training process, holding that learning within itself and then processing that input in a way derived from that training and learning" (paragraph 3). He then described the claimed invention as being an improved system for providing media file recommendations to an end user, particularly for music files, the advantage of which was to offer suggestions of similar music in terms of human perception and emotion by passing music through a trained ANN. Claim 1, which had been (rather confusingly) amended from the above claim 1 to a "system for providing semantically relevant file recommendations", was defined by Sir Anthony as "a product by process claim", while another independent claim defined a corresponding method, both defining the steps of training the ANN and providing an output of relevant files. My initial problem at this point is that claim 1 as presented in the judgment (see here) is not really a product by process claim, but is a system defined by its features of operation, which is not in line with how a typical product by process claim is defined (see for example the EPO Guidelines F-IV 4.12). 

My next problem with the judgment is Sir Anthony then going on to "assume for the moment that the ANN itself is a hardware system (as opposed to a software emulation)" (paragraph 8). Why this assumption is made is completely beyond me. There is no justification for it from the application, nor is there any sensible reason why the ANN would be assumed to be hardware. In the Court of Appeal judgment in Gale's Application [1991] RPC 305, which is supposed to be in line with the current Aerotel test, Aldous LJ found that putting instructions on hardware did not make an invention patentable. The invention in that case related to a new way of calculating a square root on a computer. Aldous LJ found that "if Mr. Gale's discovery or method or program were embodied in a floppy disc (software) neither the disc nor a computer into whose RAM that program had been inserted could be patented, it must, in my view, follow that the silicon chip with its circuitry embodying the program (hardware) cannot be patented either" (at page 332). Simply switching from software to hardware did not therefore, based on this reasoning, make an invention patentable if it did the same thing. Sir Anthony, however (who did not appear to be aware of this judgment, as he made no reference to it, nor was it pointed out to him by the UK IPO), managed to change the argument about whether the invention in this case was patentable by arbitrarily assuming that the ANN was implemented in hardware and entirely ignoring the valid line of reasoning that this should make no difference. 

The next problem I have with the judgment is another apparent misunderstanding by Sir Anthony, apparently led by the applicant's representatives, where he came to the view that the ANN training model was not programmed to include all its detailed logical steps but adjusted itself through training to produce a model which satisfied the training objective. This led him to ask where the computer program was that was said to engage the exclusion. This resulted in a confused bit of reasoning about where the program would be in the case of a hardware ANN (which had already been wrongly assumed), with the UK IPO's representative implausibly conceding that "there would in that case be no program to which the exclusion applies" (para 43). This inevitably resulted in Sir Anthony, having been well and truly led up the garden path by the applicant's attorney and the IPO's problematic representation, concluding that there was in fact no computer program, because the ANN was not operating on a set of program instructions at all but "was emulating a piece of hardware which had physical nodes and layers, and was no more operating or applying a program than a hardware system was". It becomes very difficult at this point to take the judgment seriously any more, because the reasoning is so preposterous and wrong that anything that follows from it is bound to be wrong too. And so it turned out to be, with Sir Anthony concluding that the ANN in substance operated at a different level from the underlying software on the computer and operated in the same way as a hardware ANN, resulting in the emulated ANN not being a program for a computer and therefore not excluded. 

Although it should not be necessary to point out to anyone reading this with any knowledge of computers at all, the single major flaw in the reasoning of this judgment is that the ANN, whether in hardware or software, would in actual fact be defined by software in the form of computer code defining the connections and weights of the ANN and how it operates. It should go without saying that computer software does not need to be written by a human being for it to be a computer program. Indeed, all computer software in the form it is ultimately used, i.e. object code, is not human-readable at all but is a string of 1s and 0s that is only readable by a computer. It is still, however, a program for a computer within the meaning of section 1(2) or Article 52(2) EPC. How the computer program is generated, whether by compilation of human-written source code or as the result of a training process (or, in this case, a combination of both), is not relevant to whether it is considered a program for a computer. Nor is it relevant, according at least to Gale's Application, whether the program is implemented in hardware instead of software. Both are defined by code that defines how hardware operates, either by programming a general-purpose piece of hardware or by defining a specific arrangement created or burned into circuit form in hardware.
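To make this concrete, here is a toy 'trained ANN' in Python. The weights below are invented for illustration; in practice they would be the output of a training process, but the result is the same either way: a set of numbers applied by instructions that a computer executes, which is to say a program for a computer:

```python
# The 'trained model' is nothing more than numbers. In a real ANN
# these weights and biases would be produced by training rather
# than typed in by a person, but their nature is unchanged.
weights = [[0.5, -0.3], [0.8, 0.1]]
biases = [0.0, 0.2]

def relu(x):
    """Simple activation function: pass positive values, clip the rest."""
    return max(0.0, x)

def forward(inputs):
    """One fully connected layer: multiply, accumulate, threshold.
    These are exactly the kind of instructions a processor executes,
    whether the weights came from a human or from training."""
    return [relu(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

print(forward([1.0, 2.0]))
```

Implementing the same multiply-accumulate-and-threshold arrangement in fixed circuitry rather than code would not change what it does, which is precisely the point made in Gale's Application.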

As if the judgment wasn't wrong enough at that point, Sir Anthony then went on to decide whether there would be a technical effect to the invention, despite having already decided that it was not excluded. After going through some of the usual case law on the subject, including the somewhat dubious decision in Protecting Kids, Sir Anthony went on to agree with the applicant that "moving data outside the computer system in the form of the file that is transferred [...] provides an external (outside world) effect". This is, of course, entirely out of line with all existing case law on the subject, both in the UK and at the EPO. Sir Anthony nevertheless went on to find that there was a technical effect in providing an (entirely unchanged) file selection to a user as an output of the claimed invention, and allowed the appeal. 

At this point, the only comment I have to add is the following:

[facepalm image]

  1. I am glad to see that this made someone else facepalm!

    I believe the meaning of ‘hardware’ in Gale’s Application is different to that in Emotional AI. In Gale’s Application, ‘hardware’ is used to describe a read-only memory storing a sequence of instructions for a programmable processor; whereas, in Emotional AI, I believe ‘hardware’ is used to refer to devices that do not store a sequence of instructions for a programmable processor. According to the ‘factual framework within which the provisions fall’ set out in Gale’s Application, the ‘hardware’ imagined in Emotional AI would always avoid the exclusion because there is no sequence of instructions (there is no program) and there is no memory storing instructions for a programmable processor (there is no computer). Worryingly, Emotional AI suggests that such hardware does not always avoid the exclusion. I wrote about this here …

    1. Thank you. Good to see someone else has given this some thought.

    2. Any hardware ANN is still a sequence of instructions which could be programmed into a computer, surely?

    3. That's the simple point from Gale that I was trying to make, which is why I didn't understand the assumption the judge was making.

    4. An ANN might conceivably be implemented using analogue amplifiers. In contrast to the ‘hardware’ in Gale’s Application, an ANN implemented using analogue amplifiers would not include a memory storing a sequence of instructions for a processor. The analogue circuit is not itself a sequence of instructions that could be programmed into a computer, though the circuit could be emulated by a computer. The distinction is analogous to that between a record player and an mp3 player.

    5. I wrote a bit about 'hardware' and its relation to this case.

  2. "The UK IPO have even changed their practice to instruct their examiners that, as from 29 November 2023, objections should not even be raised to inventions involving an ANN for excluded subject matter"

    This seems wrong, regardless of whether the judgment is misguided. The invention could "involve" an ANN but still have a "contribution" as defined by Aerotel/Macrossan which is excluded based on the signposts from AT&T.

    1. I think they were too quick to change their practice, especially given that they have been granted permission to appeal. The judgment should therefore be considered suspended until the Court of Appeal review it.