Saturday, September 29, 2007

Nativist Chemistry?

After a summer hiatus, the salon met on September 23rd to discuss a book by Stuart Kauffman:

A quote from decomplexity's Amazon review: Kauffman's start point is autocatalysis: that it is very likely that self-reproducing molecular systems will form in any large and sufficiently complex chemical reaction. He then goes on to investigate what qualities a physical system must have to be an autonomous agent. His aim is to define a new law of thermodynamics for those systems such as the biosphere that may be hovering in a state of self-organised criticality and are certainly far from thermodynamic equilibrium. This necessitates a rather more detailed coverage of Carnot work cycles and information compressibility than was covered in passing in his earlier books. It leads to the idea that a molecular autonomous agent is a self-reproducing molecular system capable of carrying out one or more work cycles.

But Kauffman now pushes on further into stranger and uncharted territory. The Universe, he posits, is not yet old enough to have synthesised more than a minute subset of the total number of possible proteins. This leads to the fundamental proposition that the biosphere of which we are part cannot have reached all its possible states. The ones not yet attained - the `adjacent possible' as Kauffman terms it - are unpredictable since they are the result of the interaction of the large collection of autonomous agents: us - or rather our genes - and all the other evolving things in the external world. His new fourth law of thermodynamics for self-constructing systems implies that they will try to expand into the `adjacent possible' by trying to maximise the number of types of events that can happen next.
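Kauffman's "minute subset" claim is easy to sanity-check with back-of-envelope arithmetic. A sketch in Python, using round numbers of my own choosing rather than Kauffman's figures: even granting every particle in the universe a Planck-rate protein assembly line for all of cosmic history, only a vanishing fraction of length-200 protein space could have been sampled.

```python
import math

# Illustrative arithmetic (my round numbers, not Kauffman's exact figures):
# distinct proteins of length 200 built from 20 amino acids
log10_possible = 200 * math.log10(20)        # ~260.2

# Wildly generous bound on sequences the universe could have tried:
# ~1e80 particles * ~1e43 events/sec (Planck rate) * ~4e17 sec (age)
log10_trials = 80 + 43 + math.log10(4e17)    # ~140.6

# fraction of protein sequence space even conceivably explored
log10_fraction = log10_trials - log10_possible
print(f"~10^{log10_fraction:.0f} of possible proteins")
```

Working in log10 avoids overflowing floating point; the point survives any reasonable change to the assumed numbers, since the exponents differ by more than a hundred orders of magnitude.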

The book covers more than that (see the rest of the quoted review) but we focused on the early part of the book. Kauffman points out that he wasn't really doing science, and he's right about that. However, he had a number of ideas arranged in a sequence that made some sense to us... the trick was that between the ideas there appeared to be high-flying prose and analogies to mathematical concepts where perhaps the details of the original math weren't being faithfully imported. Or maybe they were and we just couldn't see it? It was interesting to hypothesize the existence of mechanical connections and see if we could reconstruct some of them.

We spent some time with autocatalytic sets and looked into some of the assumptions required to make them work (various kinds of neighborliness among molecular species, differential reaction rates, etc.), especially the presumption that real chemistry possesses the "algorithmic generosity" required. This inspired an interesting analogy to the "debates" between working biologists and theists promoting "intelligent design"... one could imagine one group insisting that autocatalysis was a sufficient "algorithm" to explain biogenesis, while another group insisted that chemistry had to work in certain ways for the algorithm to operate successfully, and that the fact that chemistry does work in those ways was evidence that it had been "designed".
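The counting argument behind autocatalytic sets can be sketched numerically. In a pot of polymers, the number of ligation/cleavage reactions grows faster than the number of molecular species, so at any fixed probability that a random molecule catalyzes a given reaction, the expected web of catalysis eventually becomes dense enough for collective closure. A simplified sketch (the binary alphabet and this particular counting are my assumptions, not the book's exact model):

```python
# Count species and reactions for binary polymers up to a given length.
def count(max_len):
    # 2**L distinct polymers of each length L
    molecules = sum(2**L for L in range(1, max_len + 1))
    # each polymer of length L can be formed by L-1 distinct ligations
    reactions = sum((L - 1) * 2**L for L in range(1, max_len + 1))
    return molecules, reactions

for n in (4, 8, 12):
    m, r = count(n)
    print(f"max_len={n}: {m} molecules, {r} reactions, ratio {r / m:.1f}")
```

The reactions-per-molecule ratio grows roughly linearly with maximum polymer length, which is the crux of Kauffman's claim that catalytic closure becomes probable as molecular diversity rises; what the salon questioned is whether real chemistry supplies catalysts as obligingly as the uniform-probability assumption does.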

There was also some discussion of Kauffman's claims that the processes or parameters of evolution cannot be pre-stated or predicted in any meaningful way. He seems to have been inspired by theorems about computability, but it would have been nice if he'd spent more time asking whether the axioms behind those theorems really apply to biology at the level of biology that humans are interested in. He appeared to believe biological systems were doing something "non-algorithmic" in the sense that you'd have to know every detail of everything to predict what an ecosystem (or an economy) would think up next. It would have been nice if his analogy for scientific theories had been "lossy compressions of reality with possibly calculable error bounds" instead of something more pristine. (Mysterians were mentioned as having a vaguely similar attitude toward cognition... seeming to want to find something that is impossible to understand.)
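The "lossy compression with calculable error bounds" picture of a theory is easy to make concrete. A toy sketch (my example, not Kauffman's): compress eleven observations of y = x² into a two-parameter straight line, then compute the worst-case error of the compression.

```python
# "Reality": observations of y = x^2 on [0, 1].
xs = [i / 10 for i in range(11)]
ys = [x * x for x in xs]

# "Theory": the least-squares line, a 2-parameter compression of 11 points.
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# The compression is lossy, but its worst-case error is calculable.
max_err = max(abs(y - (slope * x + intercept)) for x, y in zip(xs, ys))
print(f"y ~ {slope:.2f}x + {intercept:.2f}, max error {max_err:.3f}")
```

The line is a bad theory of a parabola, but it comes with an honest, computable bound on how bad... which is the modest epistemic status the salon would have preferred Kauffman to grant scientific theories, rather than an all-or-nothing notion of predictability.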

1 comment:

Jennifer Mueller said...

I just wanted to add a riff that didn't seem appropriate to the main summary or the general theme of the blog. Call this a footnote :-)

The idea came up when it emerged that a theory working rather like Intelligent Design could start from the premise that chemistry supports biogenesis and evolution because of the "design of chemistry".

Intelligent Design proponents are spending so much time trying to convince people that "God made it that way, isn't he amazing!" that they don't seem to be looking at any of the scary implications that can fall out of that thesis, depending on where and how you think God intervenes to produce "design".

The scariness applies not just to chemistry but to a variety of levels of "design of the universe" that God would presumably have set up.

Consider "Deist Intelligent Design" where you combine a clock-maker god with all the evidence of the age of the universe and the details and mechanisms of evolution. God apparently (within Deist Intelligent Design) thinks it's a good idea to spend billions of years on microbes. God also thinks predator-prey cycles are all good. And God really seems to like a mass extinction event once in a while. And he also apparently is down with red queen dynamics. And so on and so on...

If the creationists aren't careful with the specifics of how God is supposed to be imposing design on biological systems, they might walk right into a scary-sounding (to me, anyway) "theological social Darwinism", where some nasty dynamics of what is now considered "natural selection" gain moral status as "selection by God through his providence".