A quote from decomplexity's Amazon review: Kauffman's start point is autocatalysis: that it is very likely that self-reproducing molecular systems will form in any large and sufficiently complex chemical reaction. He then goes on to investigate what qualities a physical system must have to be an autonomous agent. His aim is to define a new law of thermodynamics for those systems such as the biosphere that may be hovering in a state of self-organised criticality and are certainly far from thermodynamic equilibrium. This necessitates a rather more detailed coverage of Carnot work cycles and information compressibility than was covered in passing in his earlier books. It leads to the idea that a molecular autonomous agent is a self-reproducing molecular system capable of carrying out one or more work cycles.
But Kauffman now pushes on further into stranger and uncharted territory. The Universe, he posits, is not yet old enough to have synthesised more than a minute subset of the total number of possible proteins. This leads to the fundamental proposition that the biosphere of which we are part cannot have reached all its possible states. The ones not yet attained - the "adjacent possible" as Kauffman terms it - are unpredictable since they are the result of the interaction of the large collection of autonomous agents: us - or rather our genes - and all the other evolving things in the external world. His new fourth law of thermodynamics for self-constructing systems implies that they will try to expand into the "adjacent possible" by trying to maximise the number of types of events that can happen next.

The book covers more than that (see the rest of the quoted review), but we focused on the early part of the book. Kauffman points out that he wasn't really doing science, and he's right about that. However, he had a number of ideas arranged in a sequence that made some sense to us... the trick was that in between the ideas there appeared to be high-flying prose and analogies to mathematical concepts where perhaps the details of the original math weren't being faithfully imported. Or maybe they were and we just couldn't see it? It was interesting to hypothesize the existence of mechanical connections and see if we could reconstruct some of them.
We spent some time with autocatalytic sets and looked into some assumptions about what it took to make them work right (various kinds of neighborliness of molecular species, differential rates of reaction, etc.), especially the presumption that real chemistry possessed the "algorithmic generosity" required. It inspired an interesting analogy to the "debates" between working biologists and theists promoting "intelligent design": one could imagine one group insisting that autocatalysis was a sufficient "algorithm" to explain biogenesis, while another group insisted that chemistry had to work in certain ways for the algorithm to operate successfully, and that the fact that chemistry did work in such ways was evidence that it had been "designed".
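To make the autocatalysis idea a bit more mechanical, here is a minimal sketch loosely in the spirit of Kauffman-style catalytic reaction networks (the function names and the tiny example network are invented for illustration, not taken from the book): starting from a "food set" of freely available molecules, we repeatedly prune reactions whose reactants or catalyst can never be produced, and keep whatever self-sustaining core survives.

```python
# Toy search for an autocatalytic (RAF-like) core of a reaction network.
# A reaction is (frozenset of reactants, frozenset of products, catalyst).
# The molecule names and network below are made up for illustration.

def closure(food, reactions):
    """All molecules reachable from the food set, ignoring catalysis."""
    present = set(food)
    changed = True
    while changed:
        changed = False
        for reactants, products, _cat in reactions:
            if reactants <= present and not products <= present:
                present |= products
                changed = True
    return present

def raf_subset(food, reactions):
    """Iteratively drop reactions whose reactants or catalyst are unreachable."""
    current = list(reactions)
    while True:
        reachable = closure(food, current)
        kept = [r for r in current
                if r[0] <= reachable and r[2] in reachable]
        if len(kept) == len(current):
            return kept
        current = kept

food = {"a", "b"}
reactions = [
    (frozenset({"a", "b"}), frozenset({"ab"}), "abb"),   # catalyzed by a product of reaction 2
    (frozenset({"ab", "b"}), frozenset({"abb"}), "ab"),  # catalyzed by a product of reaction 1
    (frozenset({"a"}), frozenset({"aa"}), "zz"),         # catalyst "zz" is never produced
]
print([sorted(p)[0] for _, p, _ in raf_subset(food, reactions)])  # → ['ab', 'abb']
```

The first two reactions catalyze each other's products and so survive the pruning as a mutually sustaining pair; the third is discarded because nothing in the network ever produces its catalyst. "Neighborliness" assumptions of the kind we debated show up here as the structure of which molecules catalyze which reactions.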
There was also some discussion around Kauffman's claims that the processes or parameters of evolution could not be pre-stated or predicted in any meaningful way. It seems that he was inspired by theorems about computability, but it would have been nice if he'd spent more time wondering whether the axioms involved in those theorems really applied to biology at the level of biology that humans are interested in. He appeared to believe biological systems were doing something "non-algorithmic", in the sense that you'd have to know every detail of everything to predict what an ecosystem (or an economy) would think up next. It would have been nice if his analogy for scientific theories had been "lossy compressions of reality with possibly calculable error bounds" instead of something more pristine. (Mysterians were mentioned as having a vaguely similar attitude towards cognition... seeming to want to find something that was impossible to understand.)
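The "lossy compression with calculable error bounds" picture of a theory can be sketched in a few lines (the signal and block size here are arbitrary choices of ours, nothing from the book): summarize a signal by block averages, reconstruct from the summary, and compute the worst-case residual as the error bound. The compressed description is much smaller than the data, yet you know exactly how wrong it can be.

```python
# Toy "lossy compression with a calculable error bound": keep only block
# averages of a signal, then measure the worst-case reconstruction error.
import math

def compress(signal, block):
    """One stored average per block of samples (the lossy summary)."""
    return [sum(signal[i:i + block]) / len(signal[i:i + block])
            for i in range(0, len(signal), block)]

def reconstruct(summary, block, n):
    """Expand each stored average back over its block."""
    out = []
    for avg in summary:
        out.extend([avg] * block)
    return out[:n]

signal = [math.sin(0.1 * i) for i in range(100)]
summary = compress(signal, block=4)
approx = reconstruct(summary, block=4, n=len(signal))
error_bound = max(abs(s - a) for s, a in zip(signal, approx))
print(len(summary), round(error_bound, 3))  # 25 stored values, plus a known worst-case error
```

A theory-as-compression in this sense is predictive within stated bounds without requiring "every detail of everything", which is roughly the alternative framing we wished Kauffman had explored.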