Thursday, January 24, 2013

Rom 1 reply, 32: Is Intelligent Design reasoning little more than a smokescreen or a disguise -- as in Barbara Forrest's "Creationism in a cheap tuxedo" -- for pushing fundamentalist, theocratic religion into and seizing control of science?

One of the commonly used tactics in dealing with intelligent design arguments in a scientific or educational context is to assert that it is "Creationism in a cheap tuxedo," in the words of professor Barbara Forrest (who also happens to be a member of the New Orleans Humanists, i.e. atheists).

Motive-mongering, in short, can go both ways.

But we are not locked into ultimately useless polarisation games.

Overnight at UD, an objector to ID raised the issue of the underlying scientific credibility of design theory, so I addressed him. I think it helpful to clip that here, as it will help us put origins science in a more balanced context:

_______________

>>Time to return this thread to its proper focus, a simple test case on reliable empirical detection or recognition of design (with possibilities of identifying a metric model and from that devising a quantitative test).

It seems to me, first of all, that the Big vs Little ID distinction being advanced above [the way the accusation that ID is an illegitimate movement is being made by one particular objector] is useful only for rhetorical and distractive reasons. The best way to answer it is to go back to basics, in light of the underlying history of ideas and issues of empirically grounded scientific warrant.

So, secondly, we need to focus on the fundamental challenge of empirical warrant.

Where, since C18, science has increasingly sought to reconstruct the past based on signs in the present, using approaches that boil down to inference to the best explanation, on processes shown — observed or assumed — to be causally adequate to produce, and characteristic of, such signs in the present.

This, for instance, is more or less how Lyell argued for uniformitarian Geology. In that context, Darwin extended the approach to biology. About a hundred years ago, it was also extended to astrophysics, with, as a precursor, the competing suggestions that a solar system could form from a condensing disk vs from a filament pulled out of a star by a close brush with another star.

That is how the old world, old life picture that so dominates our contemporary view was built up.

In parallel with this, we have had the increasing rise of the view that natural — blind/purposeless — processes of mechanical necessity and chance, acting from plausible initial circumstances, should be the primary or sole means of explanation used in science. Indeed, we find today statements to the effect that this is the definition of scientific methods, and even of science.

This is a step too far, as it is a gateway for injection of a priori materialist ideology that subverts science from being an open-minded, open-ended, empirical-evidence-led pursuit of the credibly warranted truth about our world, including in the remote and unobserved past. That last point is also an important epistemological limitation: we did not observe the remote past, nor do we have generally accepted records from it; all is reconstruction on a model timeline, cumulatively built up and accepted by consensus, rather than any truly direct comparison with the actual facts that happened.

In short, we see here how origins science can easily be subverted in support of materialist ideology, which ideology has been an increasing factor over the past two centuries as well.

At the same time, we also know that design exists in the world, and that it tends to leave characteristic, observable traces. That need not be the case, but it often is. So, it is relevant to ask questions along lines pointed out by Plato in his The Laws, Bk X, on signs of causation by nature [blind chance + necessity] vs by ART. Where the ART-ificial may leave signs that reliably point to its action.


[Image: HMS Dreadnought; contrast this to a pile of rocks!]
That is where WJM’s battleship vs a pile of rocks, presumably full of iron ore, comes from.

There is an obvious, even blatant difference.

What is it?

Apart from the processing that has transformed the ore into specific iron-based alloys — note, meteoritic iron alloys “fallen from the sky” exist, but controlled composition and co-ordinated processing that yield specific useful properties are a matter of high art — the battleship shows massive contrivance. For, it is functionally organised in highly specific and complex ways, towards a purpose or goal that may be evident from its structure and function. Just like Paley’s watch on the heath vs a stone.

By extension, too, if we were to come across an avalanche on Mars, and at its foot, what is evidently a spacecraft, with heavy armour plating, weapons turrets and magazines, with co-ordinating control centres, propulsion systems, etc., we would immediately infer that we were looking at a space-faring version of the same basic concept, a battleship.

But, again, what would make these different from, say, a pile of meteoritic iron blobs?

Functionally specific, complex organisation and associated information, pointing to contrivance.

But, but, but, that leaves out an absolutely important issue, namely that living systems reproduce, and can mutate, giving rise to evolution!

Thus surfaces one of the longest standing strawman talking points in this whole field of investigation and discussion.

What do you mean by that?

I am of course pointing out how there is a lot of discussion on how Paley blundered by failing to address a key disanalogy between machinery and living forms. This is wrong, grossly and culpably wrong. In fact, by Ch II of his 1806 work (and notice, this is a generation AFTER Hume, so it would have been reasonable to have expected Paley to answer the disanalogy argument, and any fair review of Paley should therefore address this . . . ), we may simply read how he extended his watch example through an in-principle thought exercise:
Suppose, in the next place, that the person who found the watch should after some time discover that, in addition to all the properties which he had hitherto observed in it, it possessed the unexpected property of producing in the course of its movement another watch like itself — the thing is conceivable; that it contained within it a mechanism, a system of parts — a mold, for instance, or a complex adjustment of lathes, baffles, and other tools — evidently and separately calculated for this purpose . . . .
The first effect would be to increase his admiration of the contrivance, and his conviction of the consummate skill of the contriver. Whether he regarded the object of the contrivance, the distinct apparatus, the intricate, yet in many parts intelligible mechanism by which it was carried on, he would perceive in this new observation nothing but an additional reason for doing what he had already done — for referring the construction of the watch to design and to supreme art . . . . He would reflect, that though the watch before him were, in some sense, the maker of the watch, which was fabricated in the course of its movements, yet it was in a very different sense from that in which a carpenter, for instance, is the maker of a chair — the author of its contrivance, the cause of the relation of its parts to their use.
Paley of course was at least a generation too early to have advantage of Babbage’s work and over a century too early to have had that of von Neumann’s work on kinematic self replicating automata. But he nailed the heart of the matter: self replication is a further instance of contrivance, not a disanalogy to it. We may then multiply this insight by using one of the results from von Neumann et al, namely, that the stored information controlling the universal constructor is a pivotal issue that has to be explained, in the context of the implied irreducibly complex system.

This of course brings the origin of life conundrum for a priori materialist blind chance and mechanical necessity driven paradigms to centre stage. And no, this cannot be artificially severed from the onward development of life forms that requires explanation of further increments of such information. We here deal with the root of Darwin’s famous tree of life. (In context, it is highly instructive to me that the only illustration in Darwin’s Origin, the Tree of Life, would have no tracing back to the obviously required root. No root, no shoot and no branches, period.)

In short the disanalogy argument fails and has failed ever since 1806, but has been propped up through a strawman tactic that counted on the inaccessibility of Paley’s actual onward argument in Ch II as outlined.

That brings us to the issue firmly put on the table by Wicken and Orgel in the 1970′s as results and challenges for OOL research, not soundly answered from an a priori materialist perspective to this day, a full generation later:
ORGEL, 1973: . . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity. [The Origins of Life (John Wiley, 1973), p. 189.]
WICKEN, 1979: ‘Organized’ systems are to be carefully distinguished from ‘ordered’ systems. Neither kind of system is ‘random,’ but whereas ordered systems are generated according to simple algorithms [[i.e. “simple” force laws acting on objects starting from arbitrary and common- place initial conditions] and therefore lack complexity, organized systems must be assembled element by element according to an [[originally . . . ] external ‘wiring diagram’ with a high information content . . . Organization, then, is functional complexity and carries information. It is non-random by design or by selection, rather than by the a priori necessity of crystallographic ‘order.’ [“The Generation of Complexity in Evolution: A Thermodynamic and Information-Theoretical Discussion,” Journal of Theoretical Biology, 77 (April 1979): p. 353, of pp. 349-65. (Emphases and notes added. Nb: “originally” is added to highlight that for self-replicating systems, the blueprint can be built-in.)]
Thus, pace the objections above and elsewhere, we see the central importance of functionally specific, complex organisation and/or associated information [FSCO/I] in understanding distinguishing characteristics of life forms. (We also see where the term comes from — citation of Wicken in TMLO — and also a historic root of the more general term often used by Dembski et al, Specified complexity and/or complex specified information. Note WmAD has emphasised that in biological systems, the specification is cashed out in terms of function, in various ways. So, FSCO/I, and onwards particularly digitally coded functionally specific complex information, dFSCI, are where the crux of the matter lies.)

Disanalogy arguments, from Paley to von Neumann, cannot properly be used to sweep FSCO/I off the table.

And, the OOL context, where there was no pre-existing code based, information controlled replication system, is shown to be pivotal.

Indeed, that is the exact context where these issues emerged, once the molecular biology results had come in from the early 1950′s to 70′s.

And, the explanation of the root of the tree of life will then be pivotal to explaining the onward branching and diversification across body plans.

So, the issue we are looking at is absolutely pivotal and potentially revolutionary.

This is no mere backwater side issue that can be brushed aside as irrelevant and useless.

Now, let us extend our space battleship thought exercise.

It is 2080, and we are in initial stages of exploring the Asteroid belt, now with a global space consortium under UN auspices, say.

A nickel-iron asteroid with a cluster of nearby heavy-metal and rare-metal asteroids is discovered. In exploring it, we see a similar battleship, and suddenly we begin to understand the robotic instrumentation in certain parts of the previous ship, for here we find a wrecked ship that was in the process of replicating itself and evidently was using a von Neumann self-replication mechanism. Right next to the wreck, which has an obviously targeted hole through it, we find a partially completed vessel of obviously similar design, and we find idled robots that had been at work. Tracing back, we find advanced programming systems and information storage units that guided the robots in accordance with a blueprint. There are even foundry facilities that seem to make exotic alloys and materials using nanotechnologies.

Now, you tell me that under these circumstances, the scientists involved in the expedition will draw the conclusion that the space ships were now proved NOT to have been designed, as the existence of a self-replicating mechanism proves that they must somehow have spontaneously evolved from meteoritic materials as a strange life form, and that the origin of the complex functional form can be explained on survival of the fittest.

Do you see how hollow disanalogy arguments sound to people with an engineering or applied science background, once we see the issue of FSCO/I coming to bear?

That is why Denton’s point in his Evolution, a theory in crisis, from 1985 is still so relevant:
To grasp the reality of life as it has been revealed by molecular biology, we must magnify a cell a thousand million times until it is twenty kilometers in diameter [[so each atom in it would be “the size of a tennis ball”] and resembles a giant airship large enough to cover a great city like London or New York. What we would then see would be an object of unparalleled complexity and adaptive design. On the surface of the cell we would see millions of openings, like the port holes of a vast space ship, opening and closing to allow a continual stream of materials to flow in and out. If we were to enter one of these openings we would find ourselves in a world of supreme technology and bewildering complexity. We would see endless highly organized corridors and conduits branching in every direction away from the perimeter of the cell, some leading to the central memory bank in the nucleus and others to assembly plants and processing units. The nucleus itself would be a vast spherical chamber more than a kilometer in diameter, resembling a geodesic dome inside of which we would see, all neatly stacked together in ordered arrays, the miles of coiled chains of the DNA molecules. A huge range of products and raw materials would shuttle along all the manifold conduits in a highly ordered fashion to and from all the various assembly plants in the outer regions of the cell.
We would wonder at the level of control implicit in the movement of so many objects down so many seemingly endless conduits, all in perfect unison. We would see all around us, in every direction we looked, all sorts of robot-like machines . . . . We would see that nearly every feature of our own advanced machines had its analogue in the cell: artificial languages and their decoding systems, memory banks for information storage and retrieval, elegant control systems regulating the automated assembly of components, error fail-safe and proof-reading devices used for quality control, assembly processes involving the principle of prefabrication and modular construction . . . . However, it would be a factory which would have one capacity not equaled in any of our own most advanced machines, for it would be capable of replicating its entire structure within a matter of a few hours . . . .
Unlike our own pseudo-automated assembly plants, where external controls are being continually applied, the cell’s manufacturing capability is entirely self-regulated . . . .
[Denton, Michael, Evolution: A Theory in Crisis, Adler, 1986, pp. 327 – 331.]
The bottom line is that, on billions of test cases, without a good counter-instance, we know the characteristic cause of FSCO/I: design. Unless and until it has been shown that blind chance and mechanical necessity can effectively give rise to such systems, we have every epistemic right to infer that FSCO/I is a reliable sign of design as cause.

And, the strategy of applying sampling theory to give us a threshold of complexity, beyond which the explicit or implied information in an object could not credibly have come about by chance, is then a reasonable model and metric:


Chi_500 = I*S - 500, bits beyond the solar system threshold.

Where we can give some biological results in light of the Durston et al. results, discussed in the post just linked:
RecA: 242 AA, 832 fits, Chi: 332 bits beyond
SecY: 342 AA, 688 fits, Chi: 188 bits beyond
Corona S2: 445 AA, 1285 fits, Chi: 785 bits beyond
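As a minimal illustrative sketch of the arithmetic above (the function name and the use of S as a 0/1 specificity dummy variable are my own framing for illustration; the fits values are the Durston et al. figures just quoted), the metric can be computed as:

```python
def chi_500(fits: float, specific: bool = True) -> float:
    """Chi_500 = I*S - 500: bits beyond the 500-bit solar-system threshold.

    fits     -- information content I, in functional bits (per Durston et al.)
    specific -- whether the pattern is judged functionally specific (S = 1);
                if not (S = 0), the metric falls below threshold by default.
    """
    S = 1 if specific else 0
    return fits * S - 500

# The three protein-family values quoted above:
proteins = {"RecA": 832, "SecY": 688, "Corona S2": 1285}
for name, fits in proteins.items():
    print(f"{name}: {chi_500(fits):.0f} bits beyond the threshold")
```

Running this reproduces the three "bits beyond" figures listed above (332, 188 and 785); a non-specific pattern (S = 0) can never cross the threshold, which is the intended behaviour of the dummy variable.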
Finally, observe: at no point in my discussion has there been an inference to the supernatural, just to intelligence. That is, the “injecting the [irrational and chaotic] supernatural into science” talking point is a strawman, laced with ad hominems and often set alight with further incendiary remarks about right wing theocratic conspiracies aimed at imposing fascism. (But, BTW, fascism is actually a STATIST — thus leftist — ideology [one pivoting on the emergence of a nihilistic Nietzschean superman political messiah gifted and anointed to deliver the victim group in the face of allegedly unprecedented crisis . . . ], as can be seen from the thought roots of Mussolini and the very name of the analogue in Germany, the National Socialist German Workers Party.)

That strawman, too, needs to be laid to rest.

So, now, can we deal with the pivotal issue on the table, on its scientific merits?>>
_______________

Whatever ideological and philosophical debates one wants to engage, there is a serious scientific point being made by the Intelligent Design thinkers and scientists. END