While there is currently not much going on in European and UK patent law that interests me enough to write a blog post about, here is something else that has been interesting me for a while. Those of you who follow me on Twitter will probably already know what this is about. Others can read on and find out for themselves.
One of the things that impressed me most while I was training to be a patent attorney was something I found in Paul Cole's book "Fundamentals of Patent Drafting" (which can be ordered from CIPA here). On page 2 of the introduction is a footnote referencing an article titled "Cargo Cult Science" by the physicist Richard Feynman. Paul Cole identified this as required reading, so I duly went away and bought the book containing it, although it turns out the article in question is also freely available on the internet (here). The book is certainly worth buying anyway, because it is full of all sorts of strange and funny stories from Feynman's life, including how he picked the safes holding the secrets of the atomic bomb while working on the Manhattan Project.
The cargo cult science article, which was derived from an address Feynman gave to students at Caltech in 1974, aims to get across what is probably the most important idea that any scientist must understand if they are to do science properly, which is to learn not to fool yourself. Feynman is also quoted as saying that science is "the belief in the ignorance of experts" (see here), which is another important and strongly related idea, since experts in particular can be very prone to fooling themselves, particularly when they get together for a common purpose. In brief, although I would recommend you go and read it yourself, what Feynman was trying to say is that to do science properly you should take nobody's word for it, not even your own. If you see a result and think it was caused by one thing, you must then make every effort to rule out all other possibilities before settling on the idea that you know the cause and effect relationship. Only once everything else has been ruled out can you confidently say that you know what the cause was and why the effect was what it was, and even then you must leave open the possibility that there is something you have missed. Otherwise you run the very real risk of falling prey to what is known as confirmation bias, which is seeking out only those things that confirm your preconceptions, rather than doing what you should, which is actively seeking out anything that might go against your current best guess. Only by doing the latter, and then being unable to come up with anything that could otherwise explain your results, can you stand any chance of narrowing down what it is you are after, which is of course the truth. And if you don't think there is such a thing as objective truth, then science (or indeed the real world) is not for you. Try religion instead.
The philosopher Karl Popper considered that for science to work, any scientific theory must be falsifiable to count as a theory at all (see here for more); otherwise it was simply useless. The theory of gravity is a valid theory not just because all everyday observations support it but, more importantly, because it could be disproved by, for example, something falling (or not falling) contrary to what the theory predicted. The theory of evolution, which has great explanatory power for how life forms change over time, could also be disproved, for example by the existence of rabbits in the Precambrian. Other ideas, however, cannot properly be classified as scientific theories if there is no way they could be disproved, or if they are so vague as to cover every eventuality, especially if they do so retrospectively (astrology, for example). Such theories are useless, largely because they have no predictive power and explain nothing. Another philosopher, Bertrand Russell, came up with the idea of a celestial teapot as an example of a theory that could not be disproved, because no matter where you looked it could always be said that you hadn't yet found it. The burden of proof for any such non-falsifiable theories must therefore fall on those who make the claims, and not on those who consider them to be false.
In what I think of as effectively an update on Feynman's cargo cult speech, Matt Ridley (author of the excellent book The Rational Optimist) recently presented a lecture on 'scientific heresies' at the RSA in Edinburgh. This lecture has been reproduced here and here (with pictures), and has been noted on Richard Dawkins' website here (although Dawkins does not necessarily agree with Ridley's conclusions). It is also available, with images, as a pdf here. I cannot overemphasise how important it is for anyone who thinks they know how science works to read this. If there is only one thing you read about the subject in question (and after reading it I doubt that this will remain the case), then this should be it. Once you have read it in full, please feel free to come back and tell me why, and how, he is wrong.