I've been working with the Oxford Pain Research Group — that's "Pain" without an "e", no relation. I was analysing spreadsheet data about the effect of drugs on pain and sleep in large numbers of patients, and summarising it into measures of the drugs' clinical effectiveness.
Each spreadsheet contained several sets of data; each set was similar in structure to the others, but not identical. The differences were small, but irritating. So I coded a reporting function to summarise the most urgently-wanted set; and once we were satisfied with those results, wrote routines to transform the other sets so they had the same structure as that most urgently-wanted set. Thus I could get by with only one reporting function.
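The shape of that solution can be sketched in a few lines of Python. This is only an illustration of the pattern, not the actual analysis code: the field names, the canonical structure, and the `from_wide_format` transform are all hypothetical.

```python
# Sketch of the pattern: one reporting function for a canonical
# data shape, plus small transforms that coerce the other sets
# into that same shape. All names here are made up for illustration.

def summarise(records):
    """Report mean relief for records in the canonical shape:
    a list of dicts with 'patient', 'drug' and 'relief' keys."""
    return sum(r["relief"] for r in records) / len(records)

def from_wide_format(rows):
    """Transform one differently-structured set (here, plain
    (patient, drug, relief) tuples) into the canonical shape."""
    return [{"patient": p, "drug": d, "relief": x} for p, d, x in rows]

canonical = [{"patient": 1, "drug": "A", "relief": 40.0},
             {"patient": 2, "drug": "A", "relief": 60.0}]
other = [(3, "A", 20.0), (4, "A", 80.0)]

print(summarise(canonical))                # the canonical set, as-is
print(summarise(from_wide_format(other)))  # another set, transformed first
```

One `summarise`, many small transforms: each transform only has to know about its own set's quirks, and the reporting logic is written, debugged and trusted once.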
I could instead have coded different reporting functions for each set of data. Or I could have sweated away designing one function general enough to accommodate the differing structures of all the different sets. Every programmer will recognise such questions of design.
Incidentally, why is the Universe so hostile to generalisation? It is so easy to copy and paste one chunk of code to give another chunk of code, and then hack the new chunk so it does something different from the first chunk. It is so difficult to edit the first chunk and parameterise it so that it does the second thing in addition to the first. Correctly.
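To make the asymmetry concrete, here is a toy example of my own, nothing from the actual analysis. Copying and pasting gives two near-identical chunks; parameterising the first chunk does both jobs, but only after you have found and correctly updated everything that depended on the original:

```python
# The copy-and-paste route: duplicate the chunk, then hack the copy.

def mean_pain(records):
    return sum(r["pain"] for r in records) / len(records)

def mean_sleep(records):  # pasted from mean_pain, one word changed
    return sum(r["sleep"] for r in records) / len(records)

# The generalisation route: one parameterised function does both,
# but every existing caller of mean_pain must now be migrated.

def mean_of(records, field):
    return sum(r[field] for r in records) / len(records)

data = [{"pain": 3, "sleep": 7}, {"pain": 5, "sleep": 9}]
print(mean_pain(data), mean_of(data, "pain"))
print(mean_sleep(data), mean_of(data, "sleep"))
```

The copied version is trivially easy to produce and trivially easy to get right; the general version is better, but only if the edit is made everywhere, correctly.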
I ask this because the quantum physicist David Deutsch has argued that the nature of computation is not a mathematical question, but a physical one. Mathematicians talk about computable functions: that minute subset, out of all functions possible within mathematics, that we can implement on a computer. Some say there is something mathematically special about them — a computational élan vital — that distinguishes them from all the other functions. That's wrong, says Deutsch. They are special because they happen to be instantiated by this Universe's laws of physics. The Universe is really really good at implementing addition; and therefore, addition is a computable function.
But generalisation isn't. And I, as a programmer, would really really like to live in a Universe where the fundamental processes that underlie note-taking and mark-making, whether in cuneiform on clay tablets, or Prolog in Pico, would implement generalisation as readily as this Universe implements addition. It would make programming so easy: always flowing downhill towards greater applicability and power, instead of towards greater accumulation of kludge, rot and cruft.
Anyway, in this Universe, my easiest route to finishing the pain-drug analysis was to build a machine that processed one set of data, then to build filters I could ram all the other sets through, forcing them into a form the machine would also accept. This used a lot of computer time; but computer time is cheap. And in explaining to my friend Sebastian Straube of the Pain Research Group why I did things this way, I remembered the following joke:
How can you distinguish a mathematician from a physicist?
Lead them into the kitchen, give them each a kettle-full of cold water, and ask them to make a cup of tea. They will both boil the water, pour it into the tea-pot, and brew the tea.
Now give them each a kettle-full of hot water. The physicist will pour it into the tea-pot, then proceed as before. But the mathematician will empty the kettle down the sink, and fill it with cold water. This reduces the problem to one already solved.
It is a great shame that the Universe is not equipped with a built-in peephole optimiser. Had I been the one implementing the Universe, I would have included one as a service to all problem-solving creatures within my Creation.