Research on Intelligent Design

To put together scientific advances from the perspective of Intelligent Design.

Monday, October 31, 2005

Index 1 for Research on Intelligent Design

Research on Intelligent Design:

INDEX (Tue., Sept. 27 – Mon. Oct. 31, 2005)

# TITLE DATE
1 Dear Reader, Introduction. I started my trip ... Tue., Sept. 27, 05
2 Our "Inner Biological Software"... Wed., Sept. 28, 05
3 Intelligent Design, Biodiversity and Antiobesity ... Thu., Sept. 29, 05
4 In Search of an Intelligent Understanding of Variation... Fri., Sept. 30, 05
5 Key Resources For Parents And School Board Members... Sat., Oct. 01, 05
6 Excerpts of Stephen C. Meyer's Peer Reviewed Article Sun., Oct. 02, 05
7 Intelligent Design and Genetics of Bacterial Flagellum Mon., Oct. 03, 05
8 The Gull Variation. Tue., Oct. 04, 05
9 Intelligent Design for the Mechanism of Centrioles, Wells Wed., Oct. 05, 05
10 A Computer Model for the Complex Plasticity of Proteins Thu., Oct. 06, 05
11 Theory of Organismal Problem-Solving (TOPS), J. Wells Fri., Oct. 07, 05
12 The Privileged Planet (the Video Documentary) Sat., Oct. 08, 05
13 Irreducible Complexity (IC) – Seminar by M. Behe Sun., Oct. 09, 05
14 The Three Objectives of Mendelian Bioengineering Mon., Oct. 10, 05
15 Dynamic Genomes, Morphological Stasis and IC Tue., Oct. 11, 05
16 Chromosome rearrangements and transposable elements Wed., Oct. 12, 05
17 Biological Function and the Genetic Code are Interdependent Thu., Oct. 13, 05
18 Paradigm of Design: The Bacterial Flagellum, Scott Minnich Fri., Oct. 14, 05
19 Specified Complexity, Robert Shapiro and William Dembski Sat., Oct. 15, 05
20 Unlocking the Mystery of Life (the Video Documentary) Sun., Oct. 16, 05
21 Specification: The Pattern That Signifies Intelligence Mon., Oct. 17, 05
22 Palindromati. Tue., Oct. 18, 05
23 Manuscript on Palindromes, an expert’s comment Tue., Oct. 18, 05
24 Three 'Comments' on Palindromes in the Genbank Wed., Oct. 19, 05
25 "Anonymous" criticizes such Palindromatic discovery Thu., Oct. 20, 05
26 A functional entropy model for biological sequences Fri., Oct. 21, 05
27 Casting Down the Icons of Evolution, Jonathan Wells Speaks Sat., Oct. 22, 05
28 Icons of Evolution Exposed in a Panel Discussion Sun., Oct. 23, 05
29 The Origin of Intelligent Design, by Jonathan Witt Mon., Oct. 24, 05
30 Icons of Evolution, the Documentary (Video) Tue., Oct. 25, 05
31 Intelligent Design Predicted Transcription to be Proofread Wed., Oct. 26, 05
32 Michael Denton - Evolution: A Theory in Crisis. Thu., Oct. 27, 05
33 The Crayfish Variation Fri., Oct. 28, 05
34 Mendel, Paley and Babbage, casting down Darwinian Fairytales Sat., Oct. 29, 05
35 Endogenous Adaptive Mutagenesis (EAM) Sun., Oct. 30, 05
36 Teleological comment, Sen. McCain in “Character Is Destiny” Mon., Oct. 31, 05
37 Index Mon., Oct. 31, 05

-----------------------------

Teleological comments by Senator McCain in his book “Character Is Destiny“

After quoting Darwin in relation to the richness and diversity of life on Earth, Sen. McCain declared:


"I don’t see why that magnificence excludes religious faith from its interpretation"


In the same chapter, Senator John McCain addressed evolutionists by telling them to allow other colleagues (both researchers and teachers) their right to the


"perception of divine purpose"


Finally, talking about the same scientists and academics that believe in God, Sen. McCain and co-author M. Salter told evolutionists to let


"the faithful see the hand of God in nature"


What is wrong with that?

Source: Lyric Wallwork Winik, Intelligence Report; Parade, p. 15, Oct. 30, 2005.

An Update:

Tucson Region. McCain sounds like presidential hopeful. By C.J. Karamargin
Arizona Daily Star. Tucson, Arizona Published: 08.24.2005


"On Tuesday... he [McCain] sided with the president on two issues that have made headlines recently: teaching intelligent design in schools and…

McCain told the Star that, like Bush, he believes “all points of view should be available to students studying the origins of mankind.”

The theory of intelligent design says life is too complex to have developed through evolution, and that a higher power must have had a hand in guiding it."

This information was found at:

Doubting Darwin. The Intelligent Design Website of Samuel S. Chen. Website focused on the philosophical, political, and educational implications of intelligent design and evolution.

Sunday, October 30, 2005

Endogenous Adaptive Mutagenesis (EAM)

Endogenous Adaptive Mutagenesis. (2005). ISCID Encyclopedia of Science and Philosophy. Retrieved October 30, 2005 from http://www.iscid.org/encyclopedia/Endogenous_Adaptive_Mutagenesis

Endogenous Adaptive Mutagenesis refers to an approach to evolutionary theory which finds its mechanism, (that is, the causal explanation for biological evolution), within the organism itself, not in any external agent.

EAM holds that adaptation is reactive - that is, that it does not begin until after the environment induces an adaptive reaction in the organism. Also, it begins precisely at that organic point where the environmental pressure is applied, not necessarily at that part of the organism known as the genome, unless that is where the pressure is being applied. EAM is, therefore, an 'adaptive' mechanism, not a 'selective' mechanism (such as the famous Random Genetic Mutation plus Natural Selection). More importantly, adaptation and adaptive evolution are seen as intentional dynamic processes, rather than as accidental and coincidental, passively experienced, anomalous events.

EAM is a process that involves non-mechanical, non-physical, phenomena, such as self-awareness, cellular intelligence, memory, intention, and other aspects of 'mind'. These aspects exist to some extent in all life forms, but one aspect of human minds, i.e., conscious analytical thought, is seemingly reserved to Homo sapiens. These non-physical aspects guide and direct the behaviour of the organs, the cells, and the physical components within the cells. This requires a soma to germ cell line of communication, which is now acceptable thanks to recent evidence of the dissolution of 'Weismann's barrier', and the instability of Crick's 'Central Dogma'.

EAM requires that organisms use these mental phenomena to actively attempt to 'learn' to adapt, by means of a trial and error heuristic experience in which a 'best available solution' is sought to a specific 'problem'. Some solutions are sufficient, some aren't. This 'learning' can be inferred from observed effects, particularly in developmental biology. Comparisons with the immune system's network of cellular communications look promising. Biosemiotics offers insights into biological information systems. Meanwhile, 'Quorum Sensing' in bacteria and 'Collective Intelligence' in eusocial insects provide empirical support for 'organismic learning'.

[EAM] holds that every organism possesses intelligence to some degree, and that it uses that intelligence in an unconscious, instinctive way, to redesign itself and/or its behaviour, and that of its offspring, in the face of novel, crucial environmental demands. Ecological adaptedness, that is, balance between environmental pressure and an organism's capacities, replaces the 'competitive' Darwinian notion of differential 'fitness' between organisms, in the teleology of EAM.

Mike Turner


Additional Comments by M. Turner:

"The most important characteristic of this model is that it considers beneficial, (that is, adaptive), mutations, to be internally self-generated by the organism, and not by accidental, random changes to its genome. These self-generated, endogenous, mutations take place in the organism's morphology or behaviour prior to being recorded for posterity in the genome. Through epigenetic inheritance these changes may be preserved, "where changed ecological conditions persist", over several generations, until finally being coded into the genome itself. A variation of the 'Baldwin Effect' would be involved. This requires a soma to germ cell line of communication... It has not yet been conclusively demonstrated from direct observation, but, like gravity, can be inferred from observed effects..."


Links:

"Wired" discovers Endogenous Adaptive Mutagenesis
Your DNA Isn't Your Destiny, by Brandon Keim. Aug. 16, 2005

Endogenous Adaptive Mutagenesis at ARN.

Comments on "Endogenous Adaptive Mutagenesis".

Non-mechanical ontology in the explanation of organism and evolution. John J. Kineman and Jesse R. Kineman. Bear Mountain Institute.

Intelligence at the cellular level

Model of an Internal Evolutionary Mechanism (based on an extension to homeostasis) linking Stationary-Phase Mutations to The Baldwin Effect.

A 21st Century View of evolution

Adaptive developmental plasticity in snakes

Snakes challenge nature vs nurture debate

Other postings by mturner at ARN

Saturday, October 29, 2005

Mendel, Paley and Babbage, casting down Darwinian Fairytales

Even if you do not agree with everything that David Stove wrote, there are valuable observations in his book Darwinian Fairytales, for example on the historical development of nineteenth-century discoveries compatible with the current framework of Intelligent Design:

“The discovery of genes… was remarkably long drawn out in time. It extended from Mendel's work in the 1860s on crossing various strains of peas, through the rediscovery of that work in 1900, to at least the early breeding experiments of T.H. Morgan during the first world war. Now, was any of this effortless? Surely, on the contrary, Morgan and his associates had first to acquire a good deal of biological information, and then to work rather hard and long with their heads and hands, to design, perform and interpret their experiments? In 1900 Bateson perceived, though few other people did, what Mendel’s experiments on peas really meant: and I suppose that this difference between Bateson and most other people must have had something to do with his vast fund of biological information and with prolonged and severe exercise of his penetrating intelligence.”

“But in all of this, easily the greatest feat of intellectual penetration was that of Mendel himself. The phenomena of inheritance are so bewilderingly various, that no one before Mendel, not even the most expert breeders of plants and animals, had ever been able to 'see the wood for the trees'. Yet in order to be understood, these phenomena only required to be looked at in the light of two ideas - that the 'factors' of inheritance do not blend in the offspring, and that they assort themselves independently of one another - ideas which, as R.A. Fisher suggested [for example, Fisher, R. A. (1930, Oxford University Press: 1958), The Genetical Theory of Natural Selection, Dover Publications, New York, pp. 7-9.], had been as available to anyone, for thousands of years, as they were to Mendel. What… Mendel did - he tried harder: he concentrated his mental gaze for years on the vast jumble of apparently meaningless ratios of inherited characteristics in his peas, until he obliged these speechless witnesses to yield their secret.”

“During his life, of course, and for 16 years after his death, Mendel’s achievement went not only unappreciated but unnoticed. If only, now, he could have had a Dawkins to advise him on literary marketing! But this comparison, between the laborious but glorious scientific discovery of genes, and Dawkins' effortless philosophical pseudo discovery of 'memes', is too painful to be pursued for long. It excites too much indignation and contempt for the latter.”

p. 132

Essay X - Paley's Revenge or Purpose Regained

p. 178

"... the famous old 'design argument' for the existence of God ...received its classic formulation in William Paley's Natural Theology, (1802). But of course Paley did not invent the argument. For centuries before he wrote, it had been carrying conviction to almost every rational and educated mind."

"It continued to do so for another 50 years after Paley wrote. This is a historical fact which deserves to be known and reflected upon, yet it has been almost completely forgotten. Far from having suffered a fatal blow at Hume's hands in 1779, the design argument entered the period of its greatest flourishing only between 1800 and 1850. In 1829, for example, the Earl of Bridgewater provided a large sun in his will for a series of books to be written by the ablest authors, which would argue, not from revelation or from authority but rationally, for 'the Power, Wisdom, and Goodness of God, as manifested in the Creation.' [From a 'Notice' prefixed to Bell, Sir C. (1874), The Hand, (9th edition), George Bell and Son, London.]"

"The 'Bridgewater Treatises' duly came to be published, and they where written by the best authors. In retrospect, one in particular stands out. This was The Hand, (1833), by Sir Charles Bell: the greatest of all British physiologists after Harvey. Yes, that's right a whole book on the human hand, as evidence of the existence, intelligence, power and benevolence of God, only 26 years before The Origin of Species appeared! And it is - even if no one in the whole world now cares to know the fact - a very good book indeed."

p. 181

"... that sacred particle [the seed]."

Paley, Natural Theology (below, link to this book Online)

“We would all say, because we all know it to be true, that calculating-machines, automobiles, screwdrivers and the like, are just tools or devices which are designed, made, and manipulated by human beings for their own ends. Now, you cannot say this without implying that human beings are more intelligent and capable than calculators, automobiles, screwdrivers, etc.”

p. 171

"... someone who has tried in recent decades, as I have, to convince silly undergraduates of the merits of Paley's classic book..."

"... in the last 30 years, Paley has had his revenge on Darwinism, for more than a century of undeserved contempt. The explanation of adaptation by reference to the purposes of intelligent and powerful agents has come back into its own. And its reinstatement has turned out to require only some comparatively minor changes to the theology involved."

p. 182

"It is important to realise, (and pleasant to record), that the vulgar contempt for the design argument was never shared by Darwin, or by any intelligent Darwinians who belong to what might be called 'the pure strain' of intellectual descent from him. Well, this fact might have been anticipated. In any game, the formidable players are the best judges as to which of their opponents are formidable, and which are not."

"When he [Darwin] was an undergraduate at Cambridge, Darwin was required to study Paley's Evidences of Christianity, (1794). He tells us in his autobiography that 'the logic of this book and, as I may add, of his "Natural Theology", gave me as much delight as did Euclid.' Again: 'I do not think I hardly ever admired a book more than Paley's "Natural Theology". I could almost formerly have said it by heart.' [The first of these passages is from Darwin, F. (ed ) (l888), The Life and Letters of Charles Darwin, John Murray, London, Vol. l, p. 47; the second passage is from ibid., Vol. II, p.219.]

"Richard Dawkins, likewise, is full of a proper respect for Paley's explanation of adaptation. He even thinks so well of it that he cannot, he tells us, 'imagine anyone being an atheist at any time before 1859' [the year of agnostic Darwin's Origin of Species; quoted from Dawkins, R. (1986), The Blind Watchmaker, Longman, p. 5.]"

"Dawkins has some disagreements with Paley, of course; but this really is a matter of course. When did two theists ever agree on all points? For example, Paley believed in the benevolence of God: see his chapter XXCI, 'Of the Goodness of the Deity'. Dawkins, on the other hand, as we saw in Essay VII, ascribes to the gods of his religion a ruthlessly selfish character."

p. 183

"... Williams, as though he felt he had still not done enough homage to the author of Natural Theology, goes out of his way to quote and praise a passage of Paley, on the subject of - of all shop soiled examples! - the human eye. The passage is instructive, but too long to be quoted here [Williams, G. C. (1974), Adaptation and Natural Selection, Princeton Paperback, pp.258-9]. I suspect that Williams wrote it partly for the purpose of shocking the duller witted, or more historically ignorant, of his fellow Darwinians."

p. 185

“Dawkins, in order to make clear the great difference between the Paleyan explanation of adaptation and his own Darwinian one, writes (for example) as follows. 'Natural selection ... has no purpose in mind. It has no mind and no mind's eye. It does not plan for the future. It has no vision, no foresight, no sight at all.' [Dawkins, R. (1979), The Selfish Gene, Paladin Books, p. 5]"

p. 186

“Darwinians have always owed their readers a translation manual that would 'cash' the teleological language which Darwinians avail themselves of without restraint in explaining particular adaptations, into the non-teleological language which their own theory of adaptation requires. But they have never paid, or even tried to pay, this debt.”

"Nor have any Darwinians ever given, to this day, any such reconciliation of their theory with the teleological language which they employ as freely as though they were disciples, not of Darwin, but of Paley. Presumably the reason that they have not, is the same as the reason Darwin did not.”

“I am not suggesting that Darwin should not have used, or that a Darwinian should not use, teleological language when trying to explain particular adaptations. That would be a hopelessly doctrinaire and impracticable suggestion. A biologist, whether of Darwin's time or ours, can hardly frame a single thought, concerning adaptations, which does not involve intendedness or purposefulness. To ask him to purge his mind of all such thoughts, and never to use words like 'purpose', 'function', or 'contrivance', would amount in practice to telling him to stop thinking about adaptation altogether."

"I do say, though, that Darwinians cannot reasonably expect, any more than anyone else can, to be allowed to have things both ways. They cannot, on the one hand, describe adaptations as contrivances for this or as designed for that, while denying that they mean that these adaptations were ever intended; and on the other hand, decline to explain what they do mean by expressions like 'designed for' and 'contrivance for'."

"Darwinians, then, have never paid, or even acknowledged, the debt they have all along owed the public: a reconciliation of their teleological explanations of particular adaptations, with their non-teleological explanation of adaptation in general. And not only have they never paid this debt: they have in fact become progressively less conscious, with time, of the fact that they owe this debt."

p. 191

“…Darwinians, rather than admit that their theory is simply not true of our species, brazenly shift the blame, and designate all of those characteristics 'biological errors'…”

p. 221

-------------------------------------------------------

Note: Stove's emphasis. The page numbers in bold indicate where the quotations above them are to be found in the original book:

David C. Stove. Darwinian Fairytales (pp. 82 & 113), (pdf in big zip file, 16.3mb) http://www.realist.org/files

Links for the topics of these excerpts:

The Classic Genetic Works of Mendel, Bateson, Morgan et al:

http://www.esp.org/foundations/genetics/classical/browse/author-s-lst.html

Paley’s Natural Theology
[William Paley. Natural Theology; or, Evidences of the Existence and Attributes of the Deity. 12th Ed. Printed for J. Faulder. London. 1809.]

William Paley. Evidences of Christianity. 1794
http://www.wmcarey.edu/carey/paley/paley.htm

Biography of William Paley.

The Bridgewater Treatises On the Power Wisdom and Goodness of God As Manifested in the Creation
http://www.victorianweb.org/science/bridgewater.html

The Ninth Bridgewater Treatise, 1837. Charles Babbage (2nd ed. London, 1838.)
http://www.victorianweb.org/science/science_texts/bridgewater/babbage_intro.htm

Euclid's books Online:
http://etext.library.adelaide.edu.au/e/euclid
http://aleph0.clarku.edu/~djoyce/java/elements/toc.html

Sir Charles Bell, author of the Bridgewater Treatise named "The Hand".
http://www.nndb.com/people/118/000100815

Other excerpts by David Stove related to design and teleology.

Friday, October 28, 2005

The Crayfish Variation

Prompted by one of our distinguished guests, I want to present today “the crayfish variation” from an Intelligent Design perspective. This perspective helps us to see purpose, order, and a stringent quality control between compatible organisms, as well as a boundary: the natural limits to biological change. It leads us to determine and to produce new varieties of organisms, instead of the fearful, messy, boundless and careless Darwinian, Neo-Darwinian and Evolutionary 'Origin of Species' and its side dish, the flawed concept of 'speciation'.

I think that concepts such as Darwin’s 'Origin of Species' and the Neo-Darwinian and Evolutionary concepts of 'speciation' are artifacts of a pathological Darwinian frame of mind that can be healed by the practical use of Intelligent Design.

Previously, I wrote the following posting at ISCID. Now I want to illustrate it further with some links blended into its text:

Hybrids Consummate Species Invasion. Wade Roush. Science 277(5324):316-317 (Jul., 1997).

Quote:
"Biologists at the University of Notre Dame in Indiana are finding that the local crayfish are having their own effect on the invader [crayfish], as the two species produce a new population of vigorous hybrids. The finding is a surprise, researchers say, because ecologists often expect animal hybrids to be sterile, unable to play more than a bit part in species invasions. But at the Annual Evolution and Natural History meetings here, William Perry, a graduate student in the labs of ecologist David Lodge and biologist Jeff Feder, described molecular studies showing that hybrids of Kentucky native Orconectes rusticus, or the rusty crayfish, and a native crayfish, O. propinquus (the blue crayfish), are indeed fertile."

This means that those crayfishes, even though mistakenly named as members of two different species (hence supposedly 'speciated'), are in reality only genetically compatible members, varieties, of the same group.

From the same Article:

Quote:

"…these hybrids are outcompeting both natives and invaders. The rusty crayfish, it appears, is taking over by assimilation… hybrids were assumed to be less important than other species-replacement mechanisms… backcrosses between hybrids and rusty crayfish were nearly as common as first-generation hybrids, indicating that hybrids are fertile and that they tend to mate with rusty crayfish rather than with each other. Together, the first generation hybrids and backcrosses accounted for 30 % of the crayfish in one lake. The apparent prowess of the hybrids may be speeding the invasion. When Perry put rusty and blue crayfish in tanks with similarly sized hybrids, the hybrids beat both species in competition for food - such as insects and aquatic plants - and for shelters under rock piles. They are actually more competitive than the invader…"

This is a profitable example of subspeciation or microevolution from the University of Notre Dame in Indiana. A work with practical results…

To seek natural limits to biological change is outside the realm of current Evolutionism and Neo-Darwinism, as they are busy trying to demonstrate an unwarranted 'speciation'. Current biologists influenced by Evolutionism are defining as events of ‘speciation’ events that are in fact ‘variation’ within compatible organisms.

Current biology, dominated by Darwinism, Neo-Darwinism and Evolutionism, generally presents to date a negative report of interbreeding between different varieties of compatible organisms that are considered ‘as if’ they were members of different ‘species’, fearing and deploring the ‘loss of the parental lines’.

However, I think that Intelligent Design can heal that negative Darwinian frame of mind in the current biology by leading the deliberate and controlled generation of new varieties of genetically compatible organisms.

As the link on the first organism demonstrates, those 'nasty invaders' (as a 'conservationist' link declares) are tasty and edible!

So, instead of a constant negative report for compatible hybrids, we can deliberately control them by applying Mendelian Bioengineering.

Other 1997 Science’s articles by Wade Roush:

· 'Living Fossil' Fish Is Dethroned. Wade Roush. Science (Sep. 5, 1997).
The coelacanth is not supported as the nearest living relative of quadrupeds in recent DNA studies.

· A Developmental Biology Summit in the High Country. Wade Roush. Science (Aug. 1, 1997).
New molecular genetic evidence 'indicates that diverse body segments of insects did not evolve as a result of Hox gene duplication as previously thought.'

· Evolutionary biology: Sizing Up Dung Beetle Evolution. Wade Roush. Science (Jul. 11, 1997).
An example of developmental constraints in the evolution of body parts.

· Developing a New View of Evolution. Elizabeth Pennisi and Wade Roush. Science (Jul. 4, 1997).
A new approach in evolutionary biology involves analysis of genes involved in development of body plans.
Related articles:

William L. Perry, Jeffrey L. Feder, Greg Dwyer, and David M. Lodge. Hybrid Zone Dynamics and Species Replacement Between Orconectes Crayfishes in a Northern Wisconsin Lake. Evolution, 55(6):1153–1166. 2000.

Carolyn A. Klocker, & David L. Strayer. Interactions Among an Invasive Crayfish (Orconectes rusticus), a Native Crayfish (Orconectes limosus), and Native Bivalves (Sphaeriidae and Unionidae). Northeastern Naturalist, 2004.

Related definitions:

Hybrid Vigour.

Heterosis.

Other related postings:

Intelligent Design to Generate Biodiversity

MENDELIAN BIOENGINEERING and the Limits to Biochange

Calling Darwin’s Bluff

Adaptive comparisons of cave animals, part 3 (pictures of the troglobitic (cave) crayfish):

In Search of an Intelligent Understanding of Variation in Nature

The Gull Variation

The Three Objectives of Mendelian Bioengineering

Again, I want to close my post with a statement by Jonathan Wells that clarifies why the scientific pursuit of natural limits to biological change is within the framework of “Intelligent Design” and not within the flawed framework of Darwinism, Neo-Darwinism or ‘Evolution’:

"...ID could function as a "metatheory," providing a conceptual framework for scientific research. By suggesting testable hypotheses about features of the world that have been systematically neglected by older metatheories (such as Darwin's), and by leading to the discovery of new..."

Thursday, October 27, 2005

Michael Denton - Evolution: A Theory in Crisis.

Denton, an Australian molecular biologist, provides a comprehensive critique of neo-Darwinian evolutionary theory. In a penultimate chapter, entitled “The Molecular Labyrinth,” he also develops a strong positive case for the design hypothesis based on the integrated complexity of molecular biological systems. As a religiously agnostic scientist, Denton emphasizes that this case for design is based upon scientific evidence and the application of standard forms of scientific reasoning. As Denton explains, while the case for design may have religious implications, “it does not depend upon religious premises.”

On Darwinism II, Interview with Dr. Michael Denton (Approx. 58 min., a RealPlayer video)

Michael Denton's classic book "Evolution: A Theory in Crisis" with more than 60 "polarized comments" at Amazon.com, including the bright comment by Dr. Proctor, a "christian mathematician" from San Marcos, Texas:

"Dr. Michael Denton is clearly not a creationist. That is obvious from his first chapter, "Genesis Rejected". However, he is clearly questioning the truthfulness of macroevolution in any form... Even the simplest of programs (written in C++ for example) are coded by intelligent human beings. They do not appear randomly... Human beings are infinitely more complex than these programs. Thus, it appears likely that we were created by intelligent design."
Then, Gert Korthof's review in full, chapter by chapter with links (updated: 29 July 2005):

http://home.wxs.nl/~gkorthof/kortho18.htm

Observations by John A. Davidson:

http://www.reocities.com/plin9k/sexvsevol.htm

Wednesday, October 26, 2005

Intelligent Design Predicted Transcription to be Proofread

Mike Gene wrote the beautiful article Using ID to Understand the Living World. Mike used Intelligent Design as a working hypothesis to accurately infer something about the realities of the molecular world. He declared:

“...The information flow that occurs within a cell happens at several points. DNA is used to make DNA; DNA is used to make RNA; and RNA is used to make proteins. So goes the classic formulation of the Central Dogma of molecular biology. With information flow comes the issue of fidelity - how faithful is the information transferred? Scientists have long known that proofreading mechanisms exist during DNA replication where nucleotides that are misincorporated during replication are typically removed and replaced with the correct one. Similar proofreading also occurs at the two crucial points of information flow during protein synthesis: the charging of tRNAs and the anticodon-codon interactions of mRNA and tRNA..."

"...I was thinking about proofreading and it occurred to me that an important step of information flow appeared to lack proofreading, that of transcription (where DNA is used to synthesize RNA). Now, I know a few things about transcription, but I could not recall ever hearing about proofreading being associated with RNA polymerase activity (the protein complexes that synthesize RNA). It struck me that this was a great opportunity to use ID..."

"Imagine you need to translate a book from English into German and then German into Chinese. If it was important that this translation was as accurate as possible, you would employ proofreaders at both stages. For example, it would not make much rational sense to employ proofreaders to ensure the German text was accurately translated into Chinese without also using proofreaders during the first step (the English to German translation). It defeats the purpose of carefully scrutinizing the second translation if your first is sloppy."

"Thus, using this logic, I predicted that proofreading should exist during transcription (since I strongly suspect cells, much as they are today, were originally designed by a rational agent(s)). Also, given that the degree of proofreading at the level of protein synthesis was so sophisticated, it would not make sense for a rational agent to not also ensure high fidelity at the level of RNA synthesis."

"With this hypothesis in hand, I could thus go into the lab and design experiments to determine if indeed proofreading occurs during transcription. What if I did this? Well, my prediction would have born out. As it happens, I did a literature search after coming up with this hypothesis and indeed discovered there is some good evidence of proofreading during transcription…" [see reference in original source.]

"… ID is not a sterile hypothesis. ID could have indeed led me to discover proofreading during transcription. It led Harvey to figure out how the circulatory system works. ID is a useful tool. The only reason critics deny this is because they never pick it up, which of course, renders it of no use to them. Yet just because they don't want to use it is no reason why someone else can't pick it up and see how it works."

"…without such high fidelity, autonomous cellular life might not be able to exist (or if it can, it would quickly go extinct). I think proofreading at every important step in information transfer simply underscores the importance of specificity in life processes. If transcription/translation are not proofread, this not only means you increase the likelihood of plugging non-functional cogs into the machine, but you'll might also overwhelm or overtax the chaperone/folding and proteasome/degradative systems."

"Furthermore, the need for such proof-reading might be a reflection of the wide-spread nature of irreducible complexity (IC) in the core biotic processes…"

"…Darwinian selectionist thinking cannot lead to this inference (except in an after the fact manner). Why? Darwinian selection entails only that things "work", not that they be inherently rational or sophisticated. And since I had no reason to think cells cannot work without transcriptional proofreading (prior to finding that it existed), I had no reason to think natural selection had created it…"

"…there is the overall background emphasis of specificity that comes with ID. This is in stark contrast the general emphasis on messiness that comes with Darwinian thinking. The more specified something is, the stronger the design inference. Thus, one looks for examples of extreme specificity to minimize the error of making a false design inference…"

"…What ties translation and transcription? A suspicion that we're dealing with a designed system of information flow. That is, the tie is abstract and conceptual. Thus, I have been reading up on some mechanical engineering texts that outline the process of design. Not too long ago, I finished one that describes, "Function can be described in terms of the logical flow of energy, material, or information." This has significantly colored my thinking, as I have been looking for logical flows (something I would rarely expect from the blind watchmaker). In other words, if something is designed, and the two systems are linked in a more conceptual than structural manner, one expects to find a certain logic that binds them."

"…This was an analogy to highlight the relationship between specificity and logical flow among designed events. It employed "if, then" reasoning. So where did the preset "if" come from? Not from darwinian thinking, as it cannot imply these preset condition. It came from ID. If life is designed, and specificity is a common trace of design (and typically required of designed components) then conceptually, one might expect to see this theme of specificity laid out in a logical flow."

"Nothing in Darwinian theory led anyone to suspect that the RNA polymerase was proofreading; it is not even discussed in the 2004 edition of the Lodish et al, "Molecular Biology of the Cell".


Here, Mike Gene demonstrates how Intelligent Design thinking guided his reasoning; in this way we can see Intelligent Design generating testable hypotheses.
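As a rough numerical illustration of Mike Gene's two-stage translation analogy, here is a minimal Python sketch. The per-symbol error rates are invented for illustration only; they are not measurements from the article.

def two_stage_error(e1, e2):
    """Probability that a symbol is corrupted after two sequential copying
    stages with independent per-symbol error rates e1 and e2."""
    return 1 - (1 - e1) * (1 - e2)

raw = 1e-4        # hypothetical error rate for an unproofread stage
proofread = 1e-8  # hypothetical error rate for a proofread stage

print("no proofreading at all:      ", two_stage_error(raw, raw))
print("proofread second stage only: ", two_stage_error(raw, proofread))
print("proofread both stages:       ", two_stage_error(proofread, proofread))

Proofreading only the second stage leaves the overall error rate pinned near that of the sloppy first stage, which is exactly the point of the English-to-German-to-Chinese analogy.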

One expert reader declared,

"...[In] loss of vitamin D signaling, or, under non physiological conditions, where there is overexpression of so-called dominant negative forms of proteins (this occurs during some oncogenic viral infections, and in cancer)... since different classes of transcription factors recruit common cofactors (and ultimately the same RNA polymerase), dominant negative proteins can disrupt transcription controlled by several signal transduction pathways."

"Thank God for proofreading I'd say."

[His quote:] "For many [ID proponents] the main motive is not the foreseen spiritual implications but simply the denuding of what they believe is a contemptible flow of misinformation." -- Thomas Woodward, Doubts About Darwinism

Tuesday, October 25, 2005

Icons of Evolution, the Documentary

Documentary Icons of Evolution (in RealPlayer. Approx. 51 min.)

Many of the most famous “Icons of Evolution” – including Darwin’s “Tree of Life,” finches from the Galapagos Islands, and embryos that look remarkably similar – are based on outdated research and sloppy logic.

Students are being hurt by the failure to present both sides of an emerging scientific debate over Darwin’s theory.

Explore this fascinating new conflict over evolution in the classroom – a conflict based on science, not religion. Learn about the controversy that engulfs one town when a teacher, Roger DeHart, actually tries to tell students that some scientists disagree with Darwin.

From the Galapagos Islands to China, Icons of Evolution (Coldwater Media) will take you on a fast-paced, fascinating journey into one of the most controversial issues in today’s public arena.

Monday, October 24, 2005

The Origin of Intelligent Design

The Origin of Intelligent Design: A brief history of the scientific theory of intelligent design, by Jonathan Witt, Ph.D. Senior Fellow, Discovery Institute. Oct. 1, 2005 (PDF file).

Abstract:

Critics of the theory of intelligent design often assert that it is simply a re-packaged version of creationism, and that it began after the Supreme Court struck down the teaching of creationism in Edwards v. Aguillard in 1987. In reality, the idea of intelligent design reaches back to Socrates and Plato, and the term “intelligent design” as an alternative to blind evolution was used as early as 1897. More recently, discoveries in physics, astronomy, information theory, biochemistry, genetics, and related disciplines during the past several decades provided the impetus for scientists and philosophers of science to develop modern design theory. Many of the central ideas for the theory of intelligent design were already being articulated by scientists and philosophers of science by the early 1980s, well before the Edwards v. Aguillard decision.

Sunday, October 23, 2005

Icons of Evolution Exposed in a Panel Discussion

On The Theory of Evolution (Approx. 115 min.)

In this RealPlayer video, Dr. Jonathan Wells confidently exposes the “Icons of Evolution” in a panel discussion related to the philosophy and limits of science. The other two panelists are the ironic Darwinist Michael Ruse and a final panelist who focuses on his own uncertainty. Meeting presented by UCSB as part of the Focus on Origins series.

Saturday, October 22, 2005

Casting Down the Icons of Evolution

The Critical Analysis of Evolution is very important, in the same way that the Critical Analysis of Molecular Databases is vital for a better understanding of Biology and profitable for preventing errors and mistakes in medicine and pharmacology.

In the next RealPlayer video, Dr. Jonathan Wells, author of the book Icons of Evolution argues that much of what we have been taught about evolution may be wrong. The peppered moth, embryology, Darwin's finches, and the fossil record are all examples used to help illustrate his point in this latest lecture from the Focus on Origins series sponsored by UC Santa Barbara:
http://webcast.ucsd.edu:8080/ramgen/UCSD_TV/6466IcoEvoJonWel.rm

Next, a condensed version of his book:

Jonathan Wells. Survival of the Fakest. The American Spectator - Dec. 2000 /Jan. 2001 (PDF).

Friday, October 21, 2005

A functional entropy model for biological sequences

Durston KK, Chiu DKY. A functional entropy model for biological sequences. Dynamics of Continuous, Discrete and Impulsive Systems, Series B: Applications & Algorithms, 2:722-725, Special Issue, 2005. [also presented at the Proc. 4th Intern. Conf. on Engineering Applications and Computational Algorithms]

This paper introduces functional entropy as a measure of entropy that incorporates functional interpretations corresponding to certain biological functions. A measure of change of functional entropy is defined to measure entropy change between two functional states. We show here two biosequence analysis experiments based on the ankyrin repeat and the Ubx box gene. They show how two related biomolecules with different biological functions can be compared and analyzed. Furthermore, with a given limit on entropy change, intermediaries between states can also be estimated and evaluated.
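As a rough illustration of the general idea of comparing the entropy of two functional states, here is a minimal Python sketch. The column-wise Shannon-entropy treatment and the toy alignments are my own illustrative assumptions, not Durston and Chiu's actual formulation.

import math

def column_entropy(column):
    """Shannon entropy (in bits) of the residues observed in one alignment column."""
    counts = {}
    for residue in column:
        counts[residue] = counts.get(residue, 0) + 1
    total = len(column)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def alignment_entropy(sequences):
    """Sum of column entropies for equal-length, pre-aligned sequences."""
    length = len(sequences[0])
    return sum(column_entropy([seq[i] for seq in sequences]) for i in range(length))

# Hypothetical toy alignments standing in for two functional states.
state_a = ["ACDEFG", "ACDEFG", "ACDQFG", "ACDEFG"]
state_b = ["ACWEFG", "GCWEFN", "ACWQFG", "TCWEFG"]

h_a = alignment_entropy(state_a)
h_b = alignment_entropy(state_b)
print(f"entropy of state A: {h_a:.3f} bits")
print(f"entropy of state B: {h_b:.3f} bits")
print(f"change in functional entropy between the states: {h_b - h_a:.3f} bits")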

Durston KK says of this paper:
"Finding novel proteins is unlikely to occur through a series of functional intermediates, and will therefore degenerate to a random walk."

Note: Thanks to Dr. A. Voie for this excellent reference!

An Introduction to Intelligent Design

Thursday, October 20, 2005

"Anonymous" criticizes such Palindromatic discovery

Anonymous wrote:

1- The review or a focused mini review should be synthetic and critical and not just a review of the literature.
My response: I presented my work originally as a review on the subject because it reviews the thousands of sequences in the Genbank containing palindromes.

2- The author seems to have only three papers in PubMed. I don’t think he has the stature to write a review on this subject.
My response: My research field, the quality control of published sequences, is a brand new one and I am proud to provide it... I have seven related publications, as presented in my list..., two of them corresponding to specialized book reviews.

3- This is supposed to be a review but original data is presented.
My response: Yes, because it was a critical review of the palindromes that are present in the Genbank. It is not prohibited to present original data in a review...

4- The manuscript shows that an EcoRI linker sequence is present in several sequences reported to GenBank, either at their extremities or internally, which characterizes a chimera.
My response: Correct!

5- Although this observation is real…
My response: Real and in need of correction, to prevent additional biological and medical errors.

6- ...some key data is missing. Table II, which is supposed to contain all the GenBank sequences, is not available in the URL given by the author.
My response: My note on Table 2 declares that "Note: This table presents the EcoRI related linker ctcgtgccgaattcggcacgag as it appears in Genbank for some human genes. To view the rest of this Table 2 and the presence of the linker in other organisms, refer to URL: http://www.reocities.com/plin9k/t2.htm"
The additional data in the full Table 2 simply builds up the case.
I wanted to wait until the article was published to release these 150 examples, plus another 850 examples to be linked at the foot of the full Table 2. This will give... more than 1200 examples at:

http://www.reocities.com/plin9k/1200.zip (zipped Excel file)

Also, ...the attachment for the palindromes present specifically in at least 70 Affymetrix samples:

http://www.reocities.com/plin9k/affy70.zip
As I mentioned before, that data is additional data, not key data...

7- The author mentions thousands of sequences in GenBank containing the artifactual sequence. I only found 210 when I blast the sequence given by the author against nr database.
My response: To find the thousands I mention, the original version of my article states that we need to modify the default conditions with the following values (Query conditions: 1000 as the number of descriptions and of alignments and 10^6 as the minimum expected number)...
go to the end of the Blastn page and set the conditions described in my article. If no conditions are set, the results obtained are the ones that _____________ presents... however, as I declare here, the conditions originally used are the ones that need to be set. The conditions used allow a more expanded (though not complete) retrieval of palindrome artifacts at http://www.ncbi.nlm.nih.gov/BLAST

For example, I just did a search again a few days ago and ...thousands of results are thus obtained [as presented in my article]

8- The great majority of the cases are at the extremities of the sequence, while few are located internally. In the few cases that I looked more carefully, these seem to represent chimeras, as suggested by the author.
My response: Yes, but I point out that when the conditions described in my paper are used, the number of cases increases, so they are not the "few" cases described by ________________.

9- It is interesting that about half of all cases come from a few large-scale sequencing projects (Schistosoma japonicum and Triticum aestivum are the most prevalent). I suggest that the author contact the people responsible for these projects and communicate his findings.
My response: Indeed, any researcher using standard sequencing procedures can get and report a palindromic artifact, as demonstrated in my article. As the Excel file depicts, humans and mice are the ones with the most abundant artificial palindromic matches, and that's precisely why it is important to alert researchers working with other organisms of the existence of such artifacts. The importance of the publication of this article is to alert every other researcher using similar methodologies.

Wednesday, October 19, 2005

Three 'Comments' on Palindromes in the Genbank

Dear Dr.
After careful consideration, we regret to inform you that your submission was found Out of Scope of _______________. While this is most probably a disappointment to you and I apologize for the delays in sending you this decision.

We hope you will continue to consider submitting your work to ___________________.
///////////

Dear _____
Many thanks for your inquiry. The editors have discussed your proposal and we do not feel that ________ is the appropriate journal for the publication of this finding. Although the finding will be of interest, we feel that it does not provide the strength of biological advance that we must require for papers in ________ .

I am sorry that we cannot be more positive on this occasion, but hope you appreciate the reasons why we have to maintain such high standards in ________ ... for the moment we have to be very selective in what we publish.

Thanks again for your interest and support.

Best wishes
_________
/////////////////

Dear Dr. __________:
We regret to inform you that your paper was not found acceptable for publication by _____________________.

It was the Editors impression that the changes introduced in the revised manuscript were not sufficient and that essential points raised by the reviewers were not satisfactorily addressed. Therefore, your paper received low priority for publication.

Thank you for giving us the opportunity to consider your paper for publication.

Sincerely yours,

_______________

Tuesday, October 18, 2005

Manuscript on Palindromes

Next is the first e-mail that I received related to my manuscript:

Tue, 18 Oct 2005

Hi Fernando,
Great! and thank you for forwarding the manuscript. Palindromati (original title: palindramatic ;-) will be a surprise for many biologists involved in cloning, PCR or other gDNA-related activities.

Thank you very much and good luck to you.

Palindromati

fdocc at yahoo dot com
Independent Biotechnology.

ABSTRACT

This article describes a family of artificial heterotranscripts (RNA chimaeras) composed of thousands of Genbank sequences containing fragments of, or the complete, EcoRI-like adapter acting as the palindromic linker ctcgtgccgaattcggcacgag, binding together two or more genes that may be produced by different chromosomes. This happens because of the current methodologies used to produce the reported sequences, which are found in the Genbank, in Affymetrix microarrays, and in many published articles reporting or using those sequences that include the EcoRI-like linker inside coding regions and/or 5’UTR or 3’UTR mRNA sites. This EcoRI-like linker and its heterotranscripts are here deemed experimental artifacts, a characterization that can be helpful to prevent errors, both in the studies of molecular mechanisms and in the drug discovery process.
The first figure: based on my previous experimental work, I demonstrate the lack of expression of the palindromic linker, which means that it is not present in the original tissue but appears after the cloning process, being then an artificial product of the host-vector interaction. The same finding can be demonstrated, as I have done in my original article, by using any of the public results for Affymetrix microarrays based on sequences reported to Genbank.

The second figure (which appears bigger in the original article) demonstrates that the phenomenon of self-hybridization of the palindromic EcoRI-like linker seems to be blocking the enzymatic digestion.
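To make the role of the palindromic linker concrete, here is a minimal Python sketch that checks that ctcgtgccgaattcggcacgag is its own reverse complement (which is what allows it to self-hybridize) and scans a few toy records for terminal or internal occurrences. The toy records, the 30-base window used to call a hit 'terminal', and the function names are illustrative assumptions, not the screening procedure of the original article.

COMPLEMENT = str.maketrans("acgt", "tgca")
LINKER = "ctcgtgccgaattcggcacgag"  # the EcoRI-like palindromic linker discussed above

def reverse_complement(seq):
    """Reverse complement of a lowercase DNA string."""
    return seq.translate(COMPLEMENT)[::-1]

def find_linker(seq, linker=LINKER, end_window=30):
    """Return (position, 'terminal' or 'internal') for each linker hit in seq."""
    hits = []
    start = seq.find(linker)
    while start != -1:
        near_end = start < end_window or start + len(linker) > len(seq) - end_window
        hits.append((start, "terminal" if near_end else "internal"))
        start = seq.find(linker, start + 1)
    return hits

# The linker reads the same on both strands, i.e. it is a molecular palindrome.
assert reverse_complement(LINKER) == LINKER

# Hypothetical toy records standing in for GenBank entries.
records = {
    "toy_chimera_1": "atggcc" + LINKER + "a" * 60,           # linker near the 5' end
    "toy_chimera_2": "a" * 60 + LINKER + "c" * 60,           # linker buried internally
    "toy_clean": "atggccattgtaatgggccgctgaaagggtgcccgatag",  # no linker at all
}
for name, seq in records.items():
    print(name, find_linker(seq) or "no linker found")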

This article has been uploaded to the Archive of the ISCID, which is the International Society for Complexity, Information and Design:

http://www.iscid.org/boards/ubb-get_topic-f-6-t-000582.html

I have presented additional evidence in an expanded Table as well as in Zipped files with thousands of examples taken from both the Genbank and the Affymetrix DNA-Chips:

http://www.reocities.com/plin9k/t2.htm

Monday, October 17, 2005

Specification: The Pattern That Signifies Intelligence

Specification: The Pattern That Signifies Intelligence. By William A. Dembski. August 15, 2005, version 1.22 [PDF].

In: the Mathematical Foundations of Intelligent Design.

From the Abstract:

"Specification denotes the type of pattern that highly improbable events must exhibit before one is entitled to attribute them to intelligence. This paper analyzes the concept of specification and shows how it applies to design detection... the fundamental question of Intelligent Design (ID) [is]: Can objects, even if nothing is known about how they arose, exhibit features that reliably signal the action of an intelligent cause?"

From the Full Text:

1. Specification as a Form of Warrant

"...For [Alvin] Plantinga, warrant is what turns true belief into knowledge."

"It is not enough merely to believe that something is true; there also has to be something backing up that belief."

"...specification functions like Plantinga’s notion of warrant..."

"...specification is what must be added to highly improbable events before one is entitled to attribute them to design."

"... specification constitutes a probabilistic form of warrant, transforming the suspicion of design into a warranted belief in design... a systematic procedure for sorting through circumstantial evidence for design."

"... You are walking outside and find an unusual chunk of rock. You suspect it might be an arrowhead. Is it truly an arrowhead, and thus the result of design, or is it just a random chunk of rock... we have no direct knowledge of any putative designer and no direct knowledge of how such a designer, if actual, fashioned the item in question. All we see is the pattern exhibited by the item..."

2. Fisherian Significance Testing

"... In Fisher’s approach to testing the statistical significance of hypotheses, one is justified in rejecting (or eliminating) a chance hypothesis provided that a sample falls within a prespecified rejection region (also known as a critical region)... In Fisher’s approach, if the coin lands ten heads in a row, then one is justified rejecting the chance hypothesis."

"... In the applied statistics literature, it is common to see significance levels of .05 and .01. The problem to date has been that any such proposed significance levels have seemed arbitrary, lacking “a rational foundation.”"

"... significance levels cannot be set in isolation but must always be set in relation to the probabilistic resources relevant to an event’s occurrence."

"... essentially, the idea is to make a target so small that an archer is highly unlikely to hit it by chance..."

"Rejection regions eliminate chance hypotheses when events that supposedly happened in accord with those hypotheses fall within the rejection regions."

"... the probability of getting 100 heads in a row is roughly 1 in 10^30, which is drastically smaller than 1 in 9 million. Within Fisher’s theory of statistical significance testing, a prespecified event of such small probability is enough to disconfirm the chance hypothesis."

3. Specifications via Probability Densities

"... within Fisher’s approach to hypothesis testing the probability density function f is used to identify rejection regions that in turn are used to eliminate chance... [function f is nonnegative]... Since f cannot fall below zero, we can think of the landscape as never dipping below sea-level."
"In this last example we considered extremal sets... at which the probability density function concentrates minimal probability."

"... Although the combinatorics involved with the multinomial distribution are complicated (hence the common practice of approximating it with continuous probability distributions like the chi-square distribution), the reference class of possibilities Omega, though large, is finite... [and its cardinality, i.e., the number of its elements, is well-defined (its order of magnitude is around 10^33)]"

4. Specifications via Compressibility

"... The problem algorithmic information theory [the Chaitin-Kolmogorov-Solomonoff theory] seeks to resolve is this: Given probability theory and its usual way of calculating probabilities for coin tosses, how is it possible to distinguish these sequences in terms of their degree of randomness?"

"Probability theory alone is not enough."

"... Chaitin, Kolmogorov, and Solomonoff supplemented conventional probability theory with some ideas from recursion theory, a subfield of mathematical logic that provides the theoretical underpinnings for computer science... a string of 0s [zeroes] and 1s [ones] becomes increasingly random as the shortest computer program that generates the string increases in length."

"For the moment, we can think of a computer program as a short-hand description of a sequence of coin tosses."

"Thus, the sequence (N):

11111111111111111111111111111111111111111111111111
11111111111111111111111111111111111111111111111111

is not very random because it has a very short description, namely,

repeat ‘1’ a hundred times."

"... we are interested in the shortest descriptions since any sequence can always be described in terms of itself."

"The sequence (H):

11111111111111111111111111111111111111111111111111
00000000000000000000000000000000000000000000000000

is slightly more random than (N) since it requires a longer description, for example,

repeat ‘1’ fifty times, then repeat ‘0’ fifty times."

"... the sequence (A):

10101010101010101010101010101010101010101010101010
10101010101010101010101010101010101010101010101010

has a short description,

repeat ‘10’ fifty times."

"The sequence (R) [see below], on the other hand, has no short and neat description (at least none that has yet been discovered). For this reason, algorithmic information theory assigns it a higher degree of randomness than the sequences (N), (H), and (A)."

"Since one can always describe a sequence in terms of itself, (R) has the description"

copy ‘11000011010110001101111111010001100011011001110111
00011001000010111101110110011111010010100101011110’
.

"Because (R) was constructed by flipping a coin, it is very likely that this is the shortest description of (R)."

"It is a combinatorial fact that the vast majority of sequences of 0s and 1s have as their shortest description just the sequence itself."

"... most sequences are random in the sense of being algorithmically incompressible."

"... the collection of nonrandom sequences has small probability among the totality of sequences so that observing a nonrandom sequence is reason to look for explanations other than chance."

"... Kolmogorov even invoked the language of statistical mechanics to describe this result, calling the random sequences high entropy sequences, and the nonrandom sequence low entropy sequences."

"... the collection of algorithmically compressible (and therefore nonrandom) sequences has small probability among the totality of sequences, so that observing such a sequence is reason to look for explanations other than chance."

5. Prespecifications vs. Specifications

"... specifications are patterns delineating events of small probability whose occurrence cannot reasonably be attributed to chance."

"... we did see some clear instances of patterns being identified after the occurrence of events and yet being convincingly used to preclude chance in the explanation of those events (cf. the rejection regions induced by probability density functions as well as classes of highly compressible bit strings...) ... for such after-the-event patterns, some additional restrictions needed to be placed on the patterns to ensure that they would convincingly eliminate chance..."

"... before-the-event patterns, which we called prespecifications, require no such restrictions."

"... prespecified events of small probability are very difficult to recreate by chance."

"It’s one thing for highly improbable chance events to happen once. But for them to happen twice is just too unlikely. The intuition here is the widely accepted folk-wisdom that says “lightning doesn’t strike twice in the same place.” "

"... this intuition is taken quite seriously in the sciences. It is, for instance, the reason origin-of-life researchers tend to see the origin of the genetic code as a one-time event. Although there are some variations, the genetic code is essentially universal. Thus, for the same genetic code to emerge twice by undirected material mechanisms would simply be too improbable."

"With the sequence (R, see above) treated as a prespecification, its chance occurrence is not in question but rather its chance reoccurrence..."

"... specifications... patterns that nail down design and therefore that inherently lie beyond the reach of chance."

"Are there patterns that, if exhibited in events, would rule out their original occurrence by chance?"

"... the answer is yes, consider the following sequence (again, treating “1” as heads and “0” as tails; note that the designation pseudo-R here is meant to suggest pseudo-randomness):

(pseudo-R)
01000110110000010100111001011101110000000100100011
01000101011001111000100110101011110011011110111100."

"... how will you determine whether (pseudo-R) happened by chance?"

"One approach is to employ statistical tests for randomness."

"... to distinguish the truly random from the pseudo-random sequences. In a hundred coin flips, one is quite likely to see six or seven ... repetitions [see Note 21]"

[Note 21: “The proof is straightforward: In 100 coin tosses, on average half will repeat the previous toss, implying about 50 two-repetitions. Of these 50 two-repetitions, on average half will repeat the previous toss, implying about 25 three-repetitions. Continuing in this vein, we find on average 12 four-repetitions, 6 five-repetitions, 3 six-repetitions, and 1 seven-repetition.” See Ivars Peterson, The Jungles of Randomness: A Mathematical Safari (New York: Wiley, 1998), 5.]
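A quick simulation, offered here only as an illustration of Note 21, bears this out: in repeated batches of 100 fair coin flips, a run of six or more identical outcomes typically turns up in most batches, which is exactly the feature human-concocted "random" sequences tend to lack.

import random
from collections import Counter

def longest_run(flips):
    # Length of the longest run of identical outcomes in the list `flips`.
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

random.seed(0)                                         # reproducible illustration
trials = 10_000
runs = Counter(longest_run([random.randint(0, 1) for _ in range(100)])
               for _ in range(trials))
frequent = sum(count for length, count in runs.items() if length >= 6)
print(f"Batches (out of {trials}) with a run of 6 or more: {frequent}")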

"On the other hand, people concocting pseudo-random sequences with their minds tend to alternate between heads and tails too frequently."

"Whereas with a truly random sequence of coin tosses there is a 50 percent chance that one toss will differ from the next, as a matter of human psychology people expect that one toss will differ from the next around 70 percent of the time."

"... after three or four repetitions, humans trying to mimic coin tossing with their minds tend to think its time for a change whereas coins being tossed at random suffer no such misconception."

"... (R) resulted from chance because it represents an actual sequence of coin tosses. What about (pseudo-R)?"

"... (pseudo-R) is anything but random. To see this, rewrite this sequence by inserting vertical strokes as follows:

(pseudo-R)
0|1|00|01|10|11|000|001|010|011|100|101|110|111|0000|0001|0010|0011
0100|0101|0110|0111|1000|1001|1010|1011|1100|1101|1110|1111|00

""By dividing (pseudo-R) this way it becomes evident that this sequence was constructed simply by writing binary numbers in ascending lexicographic order, starting with the one-digit binary numbers (i.e., 0 and 1), proceeding to the two-digit binary numbers (i.e., 00, 01, 10, and 11), and continuing until 100 digits were recorded."

"... (pseudo-R), when continued indefinitely, is known as the Champernowne sequence and has the property that any N-digit combination of bits appears in this sequence with limiting frequency 2^-N."

“D. G. Champernowne identified this sequence back in 1933."
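The construction just described is easy to mechanize. The short Python sketch below (an illustration, not part of the excerpt) concatenates binary numerals of increasing width until 100 digits have been produced; it should reproduce (pseudo-R) exactly.

def champernowne_binary(n_digits):
    # Concatenate binary numerals in ascending order (0, 1, 00, 01, 10, 11, 000, ...)
    # until n_digits digits have been produced.
    digits = []
    width = 1
    while len(digits) < n_digits:
        for value in range(2 ** width):        # every width-digit binary numeral
            digits.extend(format(value, f"0{width}b"))
            if len(digits) >= n_digits:
                break
        width += 1
    return "".join(digits[:n_digits])

pseudo_R = ("01000110110000010100111001011101110000000100100011"
            "01000101011001111000100110101011110011011110111100")
print(champernowne_binary(100) == pseudo_R)    # expected: True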

"The key to defining specifications and distinguishing them from prespecifications lies in understanding the difference between sequences such as (R) and (pseudo-R)."

"The coin tossing events signified by (R) and (pseudo-R) are each highly improbable [to be exactly reproduced]."

"... (R), for all we know [did] arise by chance whereas (pseudo-R) cannot plausibly be attributed to chance."

6. Specificity

"The crucial difference between (R) and (pseudo-R) is that (pseudo-R) exhibits a simple, easily described pattern whereas (R) does not."

"To describe (pseudo-R), it is enough to note that this sequence lists binary numbers in increasing order."

"By contrast, (R)cannot, so far as we can tell, be described any more simply than by repeating the sequence."

"Thus, what makes the pattern exhibited by (pseudo-R) a specification is that the pattern is easily described but the event it denotes is highly improbable and therefore very difficult to reproduce by chance."

"It’s this combination of pattern simplicity (i.e., easy description of pattern) and event-complexity (i.e., difficulty of reproducing the corresponding event by chance) that makes the pattern exhibited by (pseudo-R) — but not (R) — a specification [see Note 23]"

[Note 23: It follows that specification is intimately connected with discussions in the self-organizational literature about the “edge of chaos,” in which interesting self-organizational events happen not where things are completely chaotic (i.e., entirely chance-driven and thus not easily describable)... See Roger Lewin, Complexity: Life at the Edge of Chaos, 2nd ed. (Chicago: University of Chicago Press, 2000)]

"... “displaying an even face” describes a pattern that maps onto a (composite) event in which a die lands either two, four, or six."

"... If “bidirectional,” “rotary,” “motor-driven,” and “propeller” are basic concepts, then the molecular machine known as the bacterial flagellum can be characterized as a 4-level concept of the form “bidirectional rotary motor-driven propeller.”

"Now, there are approximately N = 10^20 concepts of level 4 or less, which therefore constitute the specificational resources relevant to characterizing the bacterial flagellum."

"... Hitting large targets by chance is not a problem. Hitting small targets by chance can be."

"... putting the logarithm to the base 2... has the effect of changing scale and directionality, turning probabilities into number of bits and thereby making the specificity a measure of information... This logarithmic transformation therefore ensures that the simpler the patterns and the smaller the probability of the targets they constrain, the larger specificity."

"To see that the specificity so defined corresponds to our intuitions about specificity in general, think of the game of poker and consider the following three descriptions of poker hands: ... the probability of one pair far exceeds the probability of a full house which, in turn, far exceeds the probability of a royal flush. Indeed, there are only 4 distinct royal-flush hands but 3744 distinct full-house hands and 1,098,240 distinct single-pair hands... when we take the negative logarithm to the base 2, the specificity associated with the full house pattern will be about 10 less than the specificity associated with the royal flush pattern. Likewise, the specificity of the single pair pattern will be about 10 less than that of the full house pattern."

"... consider the following description of a poker hand: “four aces and the king of diamonds.” ...this description is about as simple as it can be made. Since there is precisely one poker hand that conforms to this description, its probability will be one-fourth the probability of getting a royal flush."

"... specificity... includes not just absolute specificity but also the cost of describing the pattern in question. Once this cost is included, the specificity of “royal flush” exceeds than the specificity of “four aces and the king of diamonds.”

7. Specified Complexity

"... the following example from Dan Brown’s... The Da Vinci Code. The heroes, Robert Langdon and Sophie Neveu, find themselves in an ultra-secure, completely automated portion of a Swiss bank (“the Depository Bank of Zurich”). Sophie’s grandfather, before dying, had revealed the following ten digits separated by hyphens: 13-3-2-21-1-1-8-5

... Sophie said, frowning. “Looks like we only get one try.” Standard ATM machines allowed users three attempts to type in a PIN before confiscating their bank card. This was obviously no ordinary cash machine...

She finished typing the entry and gave a sly smile. “Something that appeared random but was not.”
"[The digits that Sophie’ grandfather made sure she received posthumously, namely, 13-3-2-21-1-1-8-5, can be rearranged as 1-1-2-3-5-8-13-21, which are the first eight numbers in the famous Fibonacci sequence. In this sequence, numbers are formed by adding the two immediately preceding numbers. The Fibonacci sequence has some interesting mathematical properties and even has applications to biology.
Ref. 30 "For the mathematics of the Fibonacci sequence, see G. H. Hardy and E. M. Wright, An Introduction to the Theory of Numbers, 5th ed. (Oxford: Clarendon Press, 1979), 148–153. For an application of this sequence to biology, see Ian Stewart, Life’s Other Secret: The New Mathematics of the Living World (New York: Wiley, 1998), 122–132."]

"Robert and Sophie punch in the Fibonacci sequence 1123581321 and retrieve the crucial information they are seeking."

"This sequence, if produced at random (i.e., with respect to the uniform probability distribution denoted by H), would have probability 10^-10, or 1 in 10 billion."

"... for typical ATM cards, there are usually sixteen digits, and so the probability is typically on the order of 10^-15 (not 10^-16 because the first digit is usually fixed; for instance, Visa cards all begin with the digit “4”)."

"This bank wants only customers with specific information about their accounts to be able to access those accounts. It does not want accounts to be accessed by chance."

"... If, for instance, account numbers were limited to three digits, there would be at most 1,000 different account numbers, and so, with millions of users, it would be routine that accounts would be accessed accidentally"

"Theoretical computer scientist Seth Lloyd has shown that 10^120 constitutes the maximal number of bit operations that the known, observable universe could have performed... This number sets an upper limit on the number of agents that can be embodied in the universe and the number of events that, in principle, they can observe."

Accordingly, for any context of inquiry in which S might be endeavoring to determine whether an event that conforms to a pattern T happened by chance, M·N will be bounded above by 10^120. We thus define the specified complexity of T given H (minus the tilde and context sensitivity) as [From Note 46, “Lloyd’s approach [10^120] is more elegant and employs deeper insights into physics. In consequence, his approach yields a more precise estimate for the universal probability bound.”]
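The displayed formula that followed this sentence in Dembski’s paper (and the inequality referred to a few paragraphs below) appeared in the original post as images and did not carry over. A plausible reconstruction, consistent with the surrounding text, writes the specificational resources as φ_S(T) and the probability of the target under the chance hypothesis H as P(T|H):

% Reconstructed from the surrounding definitions; the notation is assumed.
\[
  \chi \;=\; -\log_{2}\!\bigl[\,10^{120}\cdot\varphi_{S}(T)\cdot P(T\mid H)\,\bigr],
\]
% and a pattern T counts as a specification when this quantity exceeds 1:
\[
  \chi \;>\; 1
  \quad\Longleftrightarrow\quad
  10^{120}\cdot\varphi_{S}(T)\cdot P(T\mid H) \;<\; \tfrac{1}{2}.
\]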

"... it is enough to note two things:

(1) there is never any need to consider replicational resources M·N that exceed 10^120 (say, by invoking inflationary cosmologies or quantum many-worlds) because to do so leads to a wholesale breakdown in statistical reasoning, and that’s something no one in his saner moments is prepared to do (for the details about the fallacy of inflating one’s replicational resources beyond the limits of the known, observable universe, see my article (in PDF) “The Chance of the Gaps”[Dembski, W.A. “The Chance of the Gaps,” in Neil Manson, ed., God and Design: The Teleological Argument and Modern Science (London: Routledge, 2002), 251–274.])

(2) ... the elimination of chance only requires a single semiotic agent who has discovered the pattern in an event that unmasks its non-chance nature. Recall the Champernowne sequence discussed in sections 5 and 6 (i.e., (pseudo-R)). It doesn’t matter if you are the only semiotic agent in the entire universe who has discovered its binary-numerical structure.

"That discovery is itself an objective fact about the world... that sequence would not rightly be attributed to chance precisely because you were the one person in the universe to appreciate its structure."

"... Since specifications are those patterns that are supposed to underwrite a design inference, they need, minimally, to entitle us to eliminate chance. Since to do so, it must be the case that
“… we therefore define specifications as any patterns T that satisfy this inequality... specifications are those patterns whose specified complexity is strictly greater than 1."

"Note that this definition automatically implies a parallel definition for context-dependent specifications...”

“Such context-dependent specifications are widely employed in adjudicating between chance and design (cf. the Da Vinci Code example). Yet, to be secure in eliminating chance and inferring design on the scale of the universe, we need the context-independent form of specification.”

“As an example of specification and specified complexity in their context-independent form, let us return to the bacterial flagellum… “bidirectional rotary motor-driven propeller.” This description corresponds to a pattern T... given a natural language (English) lexicon with 100,000 (= 10^5) basic concepts... we estimated the complexity of this pattern at approximately… 10^20"

“These preliminary indicators point to T’s specified complexity being greater than 1 and to T in fact constituting a specification.”

8. Design Detection

“Having defined specification, I want next to show how this concept works in eliminating chance and inferring design."

"Inferring design by eliminating chance is an old problem."

"Almost 300-years ago, the mathematician Abraham de Moivre addressed it as follows:

“… We may imagine Chance and Design to be, as it were, in Competition with each other, for the production of some sorts of Events, and may calculate what Probability there is, that those Events should be rather owing to one than to the other. To give a familiar Instance of this, Let us suppose that two Packs of Piquet-Cards being sent for, it should be perceived that there is, from Top to Bottom, the same Disposition of the Cards in both packs; let us likewise suppose that, some doubt arising about this Disposition of the Cards, it should be questioned whether it ought to be attributed to Chance, or to the Maker’s Design: In this Case the Doctrine of Combinations decides the Question; since it may be proved by its Rules, that there are the odds of above 263130830000 Millions of Millions of Millions of Millions to One, that the Cards were designedly set in the Order in which they were found” [Abraham de Moivre, The Doctrine of Chances (1718; reprinted New York: Chelsea, 1967), v.]
“... de Moivre requires a highly improbable prespecified event in which one ordered deck of cards rules out the chance reoccurrence of another with the same order. In particular, he places chance and design in competition so that the defeat of chance on the basis of improbability leads to the victory of design."

"Resolving this competition between chance and design is the whole point of specification.”

“... specified complexity is an adequate tool for eliminating individual chance hypotheses.”

“... Nor is it reason to be skeptical of a design inference based on specified complexity.”

“... in the Miller-Urey experiment, various compounds were placed in an apparatus, zapped with sparks to simulate lightning, and then the product was collected in a trap. Lo and behold, biologically significant chemical compounds were discovered, notably certain amino acids."

"In the 1950s, when this experiment was performed, it was touted as showing that a purely chemical solution to the origin of life was just around the corner. Since then, this enthusiasm has waned because such experiments merely yield certain rudimentary building blocks for life. No experiments since then have shown how these building blocks could, by purely chemical means (and thus apart from design), be built up into complex biomolecular systems needed for life (like proteins and multiprotein assemblages, to say nothing of fully functioning cells)...”

“... if large probabilities vindicate chance and defeat design, why shouldn’t small probabilities do the opposite — vindicate design and defeat chance? Indeed, in many special sciences, everything from forensics to archeology to SETI (the Search for Extraterrestrial Intelligence), small probabilities do just that."

"Objections only get raised against inferring design on the basis of such small probability, chance elimination arguments when the designers implicated by them are unacceptable to a materialistic worldview, as happens at the origin of life, whose designer could not be an intelligence that evolved through purely materialistic processes."

"Parity of reasoning demands that if large probabilities vindicate chance and defeat design, then small probabilities should vindicate design and defeat chance."

"The job of specified complexity is to marshal these small probabilities in a way that convincingly defeats chance and vindicates design."

“At this point, critics of specified complexity raise two objections. First, they contend that because we can never know all the chance hypotheses responsible for a given outcome, to infer design because specified complexity eliminates a limited set of chance hypotheses constitutes an argument from ignorance. But this criticism is misconceived. The argument from ignorance, also known as the appeal to ignorance or by the Latin argumentum ad ignorantiam, is

“... the fallacy of arguing that something must be true because nobody can prove it false or, alternatively, that something must be false because nobody can prove it true. Such arguments involve the illogical notion that one can view the lack of evidence about a proposition as being positive evidence for it or against it. But lack of evidence is lack of evidence, and supports no conclusion. An example of an appeal to ignorance: Styrofoam cups must be safe; after all, no studies have implicated them in cancer. The argument is fallacious because it is possible that no studies have been done on those cups or that what studies have been done have not focused on cancer (as opposed to other diseases).”
“In eliminating chance and inferring design, specified complexity is not party to an argument from ignorance. Rather, it is underwriting an eliminative induction.”

Eliminative inductions argue for the truth of a proposition by actively refuting its competitors (and not, as in arguments from ignorance, by noting that the proposition has yet to be refuted). Provided that the proposition along with its competitors form a mutually exclusive and exhaustive class, eliminating all the competitors entails that the proposition is true. (Recall Sherlock Holmes’s famous dictum:

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
“This is the ideal case, in which eliminative inductions in fact become deductions."

"But eliminative inductions can be convincing without knocking down every conceivable alternative, a point John Earman has argued effectively. Earman has shown that eliminative inductions are not just widely employed in the sciences but also indispensable to science..."

"... the other objection, namely, that we must know something about a designer’s nature, purposes, propensities, causal powers, and methods of implementing design before we can legitimately determine whether an object is designed. I refer to the requirement that we must have this independent knowledge of designers as the independent knowledge requirement. This requirement, so we are told, can be met for materially embodied intelligences but can never be met for intelligences that cannot be reduced to matter, energy, and their interactions."

"By contrast, to employ specified complexity to infer design is to take the view that objects, even if nothing is known about how they arose, can exhibit features that reliably signal the action of an intelligent cause."

“To see that the independent knowledge requirement, as a principle for deciding whether something is designed, is fundamentally misguided, consider the following admission by Elliott Sober, who otherwise happens to embrace this requirement:

“To infer watchmaker from watch, you needn’t know exactly what the watchmaker had in mind; indeed, you don’t even have to know that the watch is a device for measuring time. Archaeologists sometimes unearth tools of unknown function, but still reasonably draw the inference that these things are, in fact, tools.”
"Sober’s remark suggests that design inferences may look strictly to features of designed objects and thus presuppose no knowledge about the characteristics of the designer.”

"...what if the designer actually responsible for the object brought it about by means unfathomable to us (e.g., by some undreamt of technologies)? This is the problem of multiple realizability, and it undercuts the independent knowledge requirement because it points up that what leads us to infer design is not knowledge of designers and their capabilities but knowledge of the patterns exhibited by designed objects (a point that specified complexity captures precisely).”

"This last point underscores another problem with the independent knowledge requirement, namely, what I call the problem of inductive regress."

"Suppose... one wants to argue that independent knowledge of designers is the key to inferring design… Consider now some archeologists in the field who stumble across an arrowhead. How do they know that it is indeed an arrowhead and thus the product of design? What sort of archeological background knowledge had to go into their design hypothesis? Certainly, the archeologists would need past experience with arrowheads. But how did they recognize that the arrowheads in their past experience were designed? Did they see humans actually manufacture those arrowheads? …"

Our ability to recognize design must therefore arise independently of induction and therefore independently of any independent knowledge requirement about the capacities of designers. In fact, it arises directly from the patterns in the world that signal intelligence, to wit, from specifications.”

“Another problem with the independent knowledge requirement is that it hinders us from inferring design that outstrips our intellectual and technological sophistication. I call this the problem of dummied down design: the independent knowledge requirement limits our ability to detect design to the limits we impose on designers. But such limits are artificial.”

“Suppose, for instance, that the molecular biology of the cell is in fact intelligently designed. If so, it represents nanotechnology of a sophistication far beyond anything that human engineers are presently capable of or may ever be capable of.”

“By the independent knowledge requirement, we have no direct experience of designers capable of such design work. Thus, even if system after molecular biological system exhibited high specified complexity, the independent evidence requirement would prevent us from recognizing their design and keep us wedded to wholly inadequate materialistic explanations of these systems.”

“… Should we now think that life at key moments in its history was designed?”

“... it is a necessary condition, if a design inference is to hold, that all relevant chance hypotheses be eliminated.”

“... unknown chance hypotheses have no epistemic significance in defeating design.”

"... specified complexity has rendered all relevant chance alternatives inviable, chance as such is eliminated and design can no longer be denied.”

ACKNOWLEDGMENT:

I want to thank Jay Richards for organizing a symposium on design reasoning at Calvin College back in May, 2001 as well as for encouraging me to write up my thoughts about specification that emerged from this symposium… I also want to thank Rob Koons, Robin Collins, Tim and Lydia McGrew, and Del Ratzsch for their useful feedback at this symposium. Stephen Meyer and Paul Nelson were also at this symposium. They have now tracked my thoughts on specification for almost fifteen years and have been my best conversation partners on this topic. Finally, I wish to thank Richard Dawkins, whose remarks about specification and complexity in The Blind Watchmaker first got me thinking (back in 1989) that here was the key to eliminating chance and inferring design. As I’ve remarked to him in our correspondence (a correspondence that he initiated), “Thanks for all you continue to do to advance the work of intelligent design. You are an instrument in the hands of Providence however much you rail against it.”
Addendum 1: Note to Readers of TDI & NFL

“By separating off prespecifications from specifications, the account of specifications becomes much more straightforward. With specifications, the key to overturning chance is to keep the descriptive complexity of patterns low [see Note 24]”

“…descriptive complexity immediately confers conditional independence… [see Note 24]”

[Note 24: There is a well-established theory of descriptive complexity, which takes as its point of departure the Chaitin-Kolmogorov-Solomonoff theory of bit-string compressibility, namely, the theory of Minimal Description Length (MDL). The fundamental idea behind MDL is that order in data can be used to compress the data, i.e., to describe it using fewer symbols than needed to describe the data literally.” See http:///(last accessed June 17, 2005)]

“…prespecifications need not be descriptively simple."

"Think of a coin that’s flipped 1,000 times. The pattern it exhibits will (in all likelihood) be unique in the history of coin tossing and will not be identifiable apart from the actual event of flipping that coin 1,000 times. Such a pattern, if a prespecification, will be tractable but it will not be descriptively simple.”

“On the other hand, a Champernowne sequence of length 1,000 can be readily constructed on the basis of a simple number-theoretic scheme. The underlying pattern here, in virtue of its descriptive simplicity, is therefore tractable with respect to information that is conditionally independent of any actual coin tossing event.”

“… in the treatment of specification given here, we have a universal probability bound of 10^-120 …probability bounds are better (i.e., more useful in scientific applications) the bigger they are — provided, of course, that they truly are universal [see next equation]”

“… instead of a static universal probability bound of 10^-150, we now have a dynamic one [the previous equation]… that varies with the specificational resources… and thus with the descriptive complexity of T [see Note 24, above]”

“For many design inferences that come up in practice, it seems safe to assume that [the denominator of the previous equation] will not exceed 10^30 (for instance, in section 7 a very generous estimate for the descriptive complexity of the bacterial flagellum came out to 10^20) [see Note 24, above]”

From page 17:

“Each S can therefore rank order these patterns in an ascending order of descriptive complexity, the simpler coming before the more complex, and those of identical complexity being ordered arbitrarily. Given such a rank ordering, it is then convenient to define [the denominator of the previous equation] as follows:

the number of patterns for which S’s semiotic description of them is at least as simple as S’s semiotic description of T [see Note 26].”

“…Think of specificational resources as enumerating the number of tickets sold in a lottery: the more lottery tickets sold, the more likely someone is to win the lottery by chance.”

[Note 26: In characterizing how we eliminate chance and infer design, I’m providing what philosophers call a “rational reconstruction.” That is, I’m providing a theoretical framework for something we humans do without conscious attention to theoretical or formal principles. As I see it, semiotic agents like us tacitly assess the descriptive complexity of patterns on the basis of their own personal background knowledge. It is an interesting question, however, whether the complexity of patterns need not be relativized to such agents. Robert Koons considers this possibility by developing a notion of ontological complexity. See Robert Koons, “Are Probabilities Indispensable to the Design Inference,” Progress in Complexity, Information, and Design 1(1) (2002): available online at http://www.iscid.org/papers/Koons_AreProbabilities_112701.pdf (last accessed June 21, 2005)]
“… as a rule of thumb, 10^-120 / 10^30 = 10^-150 can still be taken as a reasonable (static) universal probability bound… the burden is on the design critic to show either that the chance hypothesis H is not applicable or that [the denominator of the previous equation] is much greater than previously suspected… for practical purposes, taking 10^-150 as a universal probability bound still works…”
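The equations referred to in this addendum were likewise lost images. Reading “the denominator” as the specificational resources φ_S(T) defined above (the number of patterns S can describe at least as simply as T), the dynamic bound being described is, up to the small constant factor coming from the “greater than 1” criterion, roughly:

% Hedged reconstruction of the dynamic universal probability bound.
\[
  P(T\mid H) \;<\; \frac{10^{-120}}{\varphi_{S}(T)},
\]
% so that, as long as phi_S(T) does not exceed 10^{30}, the familiar static
% rule of thumb applies: P(T|H) < 10^{-150} suffices to satisfy the dynamic bound.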

“… In my present treatment, specified complexity… is now not merely a property but an actual number calculated by a precise formula [the first equation posted here]”

“This number can be negative, zero, or positive. When the number is greater than 1, it indicates that we are dealing with a specification [the second equation posted here].”

Addendum 2: Bayesian Methods

“The approach to design detection that I propose eliminates chance hypotheses when the probability of matching a suitable pattern (i.e., specification) on the basis of these chance hypotheses is small and yet an event matching the pattern still happens (i.e., the arrow lands in a small target). This eliminative approach to statistical rationality, as epitomized in Fisher’s approach to significance testing, is the one most widely employed in scientific applications.”

“Nevertheless, there is an alternative approach to statistical rationality that is at odds with this eliminative approach. This is the Bayesian approach, which is essentially comparative rather than eliminative, comparing the probability of an event conditional on a chance hypothesis to its probability conditional on a design hypothesis, and preferring the hypothesis that confers the greater probability. I’ve argued at length elsewhere that Bayesian methods are inadequate for drawing design inferences.”

“Among the reasons I’ve given are the need to assess prior probabilities in employing these methods, the concomitant problem of rationally grounding these priors, and the lack of empirical grounding in estimating probabilities conditional on design hypotheses.”
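As a contrast with the eliminative route, the hypothetical sketch below shows what the comparative (Bayesian) calculation needs as inputs; every number in it is a made-up placeholder, used only to show that a prior and a likelihood under the design hypothesis must be supplied before posterior odds can be computed.

# All numbers are hypothetical placeholders, purely for illustration.
p_event_given_chance = 1e-12    # P(E | chance): computable from the chance model
p_event_given_design = 1e-3     # P(E | design): must somehow be estimated
prior_odds_design = 1e-6        # prior odds of design vs. chance: must be supplied

likelihood_ratio = p_event_given_design / p_event_given_chance
posterior_odds = prior_odds_design * likelihood_ratio
print(f"Posterior odds (design : chance) = {posterior_odds:.3g}")

# The eliminative (Fisherian) route only asks whether P(E | chance) falls
# below a probability bound for an independently specified pattern; it never
# has to estimate P(E | design) or assign a prior.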

“… the most damning problem facing the Bayesian approach to design detection, namely, that it tacitly presupposes the very account of specification that it was meant to preclude.”

“Bayesian theorists see specification as an incongruous and dispensable feature of design inferences. For instance, Timothy and Lydia McGrew, at a symposium on design reasoning, dismissed specification as having no “epistemic relevance.”

“… the Bayesian approach to statistical rationality is parasitic on the Fisherian approach and can properly adjudicate only among competing hypotheses that the Fisherian approach has thus far failed to eliminate.”

“In particular, the Bayesian approach offers no account of how it arrives at the composite events (qua targets qua patterns qua specifications) on which it performs a Bayesian analysis. The selection of such events is highly intentional and, in the case of Bayesian design inferences, presupposes an account of specification."

"Specification’s role in detecting design, far from being refuted by the Bayesian approach, is therefore implicit throughout Bayesian design inferences.”