Computer Models of Evolution
What's the difference between the process of evolution in a computer and the process of evolution outside
the computer? The entities that are being evolved are made of different stuff, but the process is identical.... These abstract computer processes
make it possible to pose and answer questions about evolution that are not answerable if all one has to work with is the fossil record and fruit flies.
— Christopher G. Langton (1)
Any theory holding that life originates on Earth de novo from nonliving chemicals has problems for which computers provide a good metaphor. The
problems are in two categories, hardware and software. For life as for computers, both are required, together.
After eukaryotic cellular life has become established, in this metaphor, the machinery for creating new biological hardware is in place, and the
remaining problem is one of software only. How does the genetic programming for new evolutionary features get written and installed?
This aspect of the problem of evolution is a good one to focus on because computers are everywhere and can be readily observed. We can ask the
same question about real computers: how do new computer programs get written and installed? Of course, the answer is that computer
programmers write the programs and computer users install them.
But Darwinism holds that during the course of evolution there were no programmers for genetic programs: the process was blind, self-driven. An
analogous process in the world of computers would cause new computer programs or subroutines to appear spontaneously in the traffic of
computer code being copied and transferred. If a spontaneous new computer program or subroutine somehow became able to replicate itself, it
would have taken a significant step toward "life." If, subsequently, it accrued other advantages, like concealment, it would have a "survival" advantage
in the world of computer traffic. From there, by analogy with Darwinism, it could grow and multiply and have properties similar to life. Does this ever happen?
Alternatively, it should be possible for scientists to artificially create a computer "environment" in which the evolution of computer programs could
occur. Parameters governing the mutation and recombination rates could be optimized for the evolution of new programs. At the lightning speed of
modern computers, jillions of trials could be run to see if randomness coupled with any nonteleological iterative process can ever write computer
programs with genuinely new functions. Has this been done?
The Blind Watchmaker
Nothing in my biologist's intuition, nothing in my 20 years' experience of programming computers, and nothing in my wildest dreams prepared me
for what emerged on the screen.... With a wild surmise, I began to breed, generation after generation, from whichever child looked most like an
insect. My incredulity grew in parallel with the evolving resemblance.... I still cannot conceal from you my feeling of exultation as I first watched these
exquisite creatures emerging before my eyes. I distinctly heard the triumphal opening chords of Also sprach Zarathustra (the ‘2001 theme') in my
mind. — Richard Dawkins (2)
Richard Dawkins has written several computer programs which function, he says, like evolution. Computers today are powerful and can run the
programs very quickly. This speed enables Dawkins to compress a single generation, or computer trial, into a fraction of a second. In his widely
acclaimed 1987 book The Blind Watchmaker, from which the above quote comes, he tells about several such programs.
One of Dawkins's programs begins with a random string of letters and evolves it into a target sentence. The sentence is from Shakespeare: "METHINKS IT IS
LIKE A WEASEL." The evolution takes only 64, 43, or 41 generations in different computer trials. Dawkins acknowledges that the chance of that short
sentence getting produced in a given random trial is, "about 1 in 10,000 million million million million million million." But, he continues, with
"cumulative selection," the thing becomes doable. Each time a random computer trial happens to produce a correct letter in a slot, that letter is
preserved by cumulative selection (p 46-50).
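The cumulative-selection scheme as the text describes it — a correct letter, once it appears in a slot, is preserved — can be sketched in a few lines of Python. This is a simplified caricature of that description, not Dawkins's actual program, so the generation counts will not match his 64, 43, and 41:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def weasel(seed):
    """Redraw every wrong letter each generation; correct letters are
    'preserved by cumulative selection,' as the text puts it."""
    rng = random.Random(seed)
    phrase = [rng.choice(ALPHABET) for _ in TARGET]
    generations = 0
    while "".join(phrase) != TARGET:
        generations += 1
        for i, letter in enumerate(phrase):
            if letter != TARGET[i]:
                phrase[i] = rng.choice(ALPHABET)
    return generations

# The generation count depends on the seed, but convergence is fast.
print(weasel(0))
```

Note that the program can only work because `TARGET` is spelled out in advance, which is precisely the objection raised next.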
There is a problem with using Dawkins's scheme as an analogy for evolution. In order for there to be such a thing as a correct letter, the complete
sentence has to already exist. In real life, this would require evolution to be teleological, that is, to have a prescribed goal. Teleology in nature is the
very thing Darwinists abhor. Random mutations cannot have any prescribed goal. For life to evolve this way, what preexisting model is it emulating?
Alternatively, if there is no model in Dawkins's computer, how is the sentence that is only 61 percent wrong favored over the one that is 86 percent
wrong? How is "MDLDMNLS ITJISWHRZREZ MECS P" better than "WDLTMNLT DTJBSWIRZREZLMQCO P"? After presenting this idea, Dawkins
then admits, "Life isn't like that" (p 50).
Dawkins also uses the computer to generate artificial creatures he calls "Biomorphs." He begins with some stick figures on the computer screen.
He allows a few variables to change at random, within prescribed parameters. This changes the shapes of the stick figures. The resulting creatures
show some variety, and Dawkins is good at naming them. The evolution they undergo is analogous to biological evolution, he says. So he offers his
creatures as further evidence that chance can write new genetic programs. His enthusiasm for the Biomorph program is evident in the quotation at
the beginning of this section.
Dawkins acknowledges that he uses artificial selection to guide the process. His creatures are tightly constrained by his Biomorph software and
completely dependent on the computer's operating software. Deleterious mutations are not possible. The ratio of actual to possible creatures in his
genetic scheme is one to one: every sequence makes a viable creature. He achieves the "evolution" by adjusting only a few variables within narrow
ranges. So the only changes that occur in the creatures are those whose potential is already available in the program originally. The evolution that he
is able to simulate is, at most, microevolution.
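The constraints just listed can be made concrete in a small sketch. This is my construction, not Dawkins's code: the genome is a handful of bounded integers, mutation nudges one gene within its prescribed range, and every possible genome is "viable" by construction.

```python
import random

# A "biomorph" genome, as characterized in the text: a few variables, each
# confined to a prescribed range, so every possible genome yields a creature.
GENE_RANGES = [(1, 9)] * 5

def mutate(genome, rng):
    """Nudge one gene by +/-1, clamped to its range; no deleterious
    mutation is possible by construction."""
    i = rng.randrange(len(genome))
    lo, hi = GENE_RANGES[i]
    child = list(genome)
    child[i] = max(lo, min(hi, child[i] + rng.choice((-1, 1))))
    return child

rng = random.Random(42)
parent = [5, 5, 5, 5, 5]
for _ in range(20):
    parent = mutate(parent, rng)

# Whatever happens, the creature stays inside the space the program
# defined in advance -- microevolution at most.
assert all(lo <= g <= hi for g, (lo, hi) in zip(parent, GENE_RANGES))
```

No sequence of such mutations can produce a gene, or a range, that was not written into `GENE_RANGES` at the start.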
At the back of The Blind Watchmaker is an order form that can be used to obtain a copy of Dawkins's program, "The Blind Watchmaker Evolution
Simulation Software," for $10.95. Do you think that the installation instructions say, "Don't be careful when you install this program, and don't make a
backup copy of your system first, because mistakes are the way things get better?"
No. If chance were able to write improvements to computer programs, we would know about it by now. "Hey, thanks for the freeware spreadsheet
program you copied and gave me. By the way, due to some chance errors in the copying, I ended up with a version 3.1 copy, from your version 3.0
original. I'll be able to do mortgage tables. Isn't it great?" "Yeah, I've heard of that before."
No, the best outcome that ever follows the exchange of computer programs is that the programs work as expected. And what often happens is more
like, "Hey, thanks for the freeware spreadsheet program you copied and gave me. But after I loaded it, everything crashed. Can you come over and
help me?" Maybe another piece of software is required for the spreadsheet to work on the new computer. Or maybe the other necessary software is
already loaded, but locked up by some previous programming. Tinkering is often required.
How would Richard Dawkins's artificial creatures work if you allowed chance mutations to affect the "Blind Watchmaker" software, or the computer's
operating programs? Would you guess that Dawkins himself carefully makes backup copies of his programs, lest something should, by chance,
alter them? If Dawkins had to conduct evolutionary simulations with operating and applications programs that had been slightly randomized, what
music would he hear then?
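One way to see the likely answer is to try the experiment in miniature. The sketch below (a hypothetical demonstration of my own, not anything from Dawkins's book) applies random one-character mutations to a tiny, correct Python function and counts how many mutants still compile and still compute the right answer:

```python
import random

SOURCE = "def add(a, b):\n    return a + b\n"

def mutate(src, rng):
    """Replace one character with a random printable character."""
    i = rng.randrange(len(src))
    return src[:i] + chr(rng.randrange(32, 127)) + src[i + 1:]

def still_works(src):
    """Does the mutant still compile AND still add correctly?"""
    env = {}
    try:
        exec(compile(src, "<mutant>", "exec"), env)
        return env["add"](2, 3) == 5
    except Exception:
        return False

rng = random.Random(0)
survivors = sum(still_works(mutate(SOURCE, rng)) for _ in range(1000))
print(f"{survivors} of 1000 one-character mutants still work")
```

The survivors are overwhelmingly the mutations that happened to change nothing; random edits to working code are, at best, neutral.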
The Santa Fe Institute
There is a theory that deals with the application of computers to evolution. It's sometimes called the theory of complex systems, or complexity theory.
Ilya Prigogine is considered one of its founders. The theory is hard to summarize; it is perhaps misleading to even describe complexity theory as a
single theory. Regardless of how it is characterized, since 1984, complexity theory has had an institutional home at the Santa Fe Institute in Santa
Fe, New Mexico.
Physicist Murray Gell-Mann, one of the founders of the Santa Fe Institute and cochair of its Science Board, describes complexity theory by saying that
people in this field work from the top down, whereas scientists usually work by "reductionism" from the bottom up. Gell-Mann thinks both
approaches are needed. The hallmark of complexity theory is the extensive use of information theory and computers to model the behavior of
notoriously complex problems such as economic markets, protein folding, and evolution. In 1994 Gell-Mann published a book, The Quark and the
Jaguar (3), largely about this field. One might expect that Gell-Mann's book would describe some examples, if there are any, of computer programs
that can independently evolve to more organized forms. He briefly mentions Richard Dawkins's Biomorphs program without endorsing it. "In real
biological evolution there is no designer in the loop" (p 318).
The closest possibility Gell-Mann discusses is a program called Tierra, written by Thomas S. Ray. In this program there is a standard "ancestor," an
"organism" consisting of eighty computer instructions. From it other organisms descend. Mutations are introduced into these descendants at a rate
about a million times higher than the average mutation rate in eukaryotes. When the computer's memory is full, the older or more defective
organisms are "killed off." The outcome of this artificial process about which Gell-Mann has the most to say is the evolution of a more compressed
version of the original ancestor, one with only thirty-six instructions instead of eighty (p 315). He mentions the evolution of no new features in Tierra. Yet the
noteworthy outcome of biological evolution is not the reduction of the instruction set, but the growth of it, and the emergence of new features. As real
life evolves, new genes with new meaning are added to the genome.
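The compression Gell-Mann reports can be caricatured in a toy model (my reduction, not Ray's actual code): each organism is represented by nothing but its instruction count, and the reaper that runs when "memory" fills culls the longest program.

```python
import random

rng = random.Random(1)
# Tierra's ancestor has 80 instructions; copying cost grows with length,
# so shorter genomes win the replication race.
population = [80] * 50

for _ in range(10_000):
    parent = rng.choice(population)
    child = max(10, parent + rng.choice((-1, 0, 1)))  # length-only mutation
    population.append(child)
    if len(population) > 50:
        population.remove(max(population))  # "memory full": cull the longest

mean_len = sum(population) / len(population)
print(round(mean_len, 1))
assert mean_len < 80  # selection compressed the programs
```

Selection in this model can only shrink or shuffle what the ancestor already contained; nothing in it can add an instruction with a new meaning, which is the point being made about Tierra itself.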
The current champion of the theory of complex systems, as it pertains to biology, is Stuart Kauffman, also of the Santa Fe Institute. "Kauffman has
spent decades trying to show—through elaborate computer simulations—that Darwinian theory alone cannot account for the origin or subsequent
evolution of life" (4). Kauffman proposes that a process he has named "autocatalysis" is necessary and sufficient to create life from nonliving
chemicals. Autocatalysis is a chemical process which is enhanced by one of its own products. For example, imagine a pair of complementary
nucleotides that could, as a unit, enhance the pairing rate of the same nucleotides, unpaired. If unpaired nucleotides were constantly supplied, like
"food" for the reaction, autocatalysis would produce many nucleotide pairs. This concept has attracted some attention, but so far its applicability to
biology is tenuous.
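The nucleotide-pair example can be written as a simple rate equation. The formulation below is a generic autocatalytic sketch, not Kauffman's own model: pairs form slowly on their own, and each existing pair speeds the formation of more, with the "food" supply held constant.

```python
def pair_concentration(steps=200, food=1.0, k1=0.01, k2=0.5, dt=0.1):
    """Integrate dP/dt = F(k1 + k2*P): spontaneous pairing at rate k1,
    plus pairing catalyzed by existing pairs at rate k2."""
    pairs = 0.0
    trace = [pairs]
    for _ in range(steps):
        pairs += dt * food * (k1 + k2 * pairs)
        trace.append(pairs)
    return trace

trace = pair_concentration()
# The hallmark of autocatalysis: growth accelerates as product accumulates.
assert trace[-1] - trace[-2] > trace[1] - trace[0]
```

The runaway growth is real, but notice that it only multiplies the product already specified; it does not, by itself, write any new chemistry.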
In his 1995 book, At Home in the Universe (5), Kauffman also mentions Tierra. To Kauffman what's interesting about Tierra is the fact that the
organisms become extinct. "They disappear from the stage and are heard from no more" (p 238). From this behavior he draws a lesson about the
size and frequency of extinctions in real life, not about evolutionary advances. The evolution in Tierra which he describes is all copying and shuffling;
the process does not generate new programs or subroutines.
Kauffman also discusses another attempt to duplicate evolution in a computer model. The result is negative (p 276-277):
Since the Turing machine and its programs can themselves be specified by a sequence of binary digits, one string of symbols is essentially
manipulating another string. Thus the operation of a Turing machine is a bit like the operation of an enzyme on a substrate, snipping a few atoms
out, adding a few atoms here and there.
What would happen, McCaskill had wondered, if one made a soup of Turing machines and let them collide; one collision partner would act as the
machine, and the other partner in the collision would act as the input tape. The soup of programs would act on itself, rewriting each other's
programs until... Until what?
Well, it didn't work. Many Turing machine programs enter an infinite loop and "hang." In such a case, the collision partners become locked in
a mutual embrace that never ends, yielding no "product" programs. The attempt to create a silicon self-reproducing spaghetti of programs failed. Oh, well.
Chris Langton, External Professor of the Santa Fe Institute, clearly believes that it is reasonable to ask for a computer model that emulates life (6):
"...If a programmer creates a world of 'molecules' that—by following rules such as those of chemistry—spontaneously organize themselves into
entities that eat, reproduce and evolve, Langton would consider those entities to be alive 'even if it's in a computer.'"
In September 1995, Langton was asked: if chance can write the new genetic code behind evolutionary progress, then it should be able to write new
computer code. Can it? Langton answered yes (7). His example was a string of computer code two bits long used to specify one strategy in a
computer game simulating evolution (the "Prisoner's Dilemma"). Because mutations are allowed, the string of code can occasionally double to four
bits; sometimes this doubling can lead to a superior strategy.
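The kind of doubling Langton described can be sketched as follows. This is my reconstruction of the general idea, not his code: a genome is a bit-string lookup table over the opponent's recent moves, and duplication doubles it from memory-one to memory-two without changing behavior, leaving the extra bits free to mutate into a richer strategy later.

```python
def duplicate(genome):
    """Gene duplication: the bit string copies itself end to end."""
    return genome + genome

def play(strategy, opponent_history):
    """Look up our move from the opponent's recent moves (0 = cooperate,
    1 = defect). A 2-bit genome reads one past move; 4 bits read two."""
    memory = len(strategy).bit_length() - 1
    index = 0
    for move in opponent_history[-memory:]:
        index = index * 2 + move
    return strategy[index]

tit_for_tat = [0, 1]              # echo the opponent's last move
doubled = duplicate(tit_for_tat)  # [0, 1, 0, 1]: four bits, memory-two

# Duplication alone is behavior-preserving: the copy still ignores the
# older move. Only later point mutations on the extra bits could encode
# a strategy that no two-bit genome can express.
for history in ([0, 0], [0, 1], [1, 0], [1, 1]):
    assert play(doubled, history) == play(tit_for_tat, history[-1:])
```

As the sketch shows, the doubling itself adds no new behavior at all, which underlines how small a step this example really is.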
This example seems weak. If any random process can write computer programs, there should be examples more impressive than the duplication
of two bits. This is equivalent to the insertion of one nucleotide in a real genome. One possible excuse for this weakness is that not enough time
has gone by for self-generated computer programs to emerge. But evolution is a robust process. If new genetic programs can be created without
input in the biological world, wouldn't there be some convincing indication of an analogous process in the computer world by now?
[Figure caption: A hyper-parasite (red, three-piece object) steals the CPU from a parasite (blue sphere). Using the stolen CPU, and its own CPU (red sphere), it is able to produce two daughters (wire-frame objects on left and right) simultaneously. — Thomas S. Ray (8)]
The author of the Tierra program, Tom Ray, sees much drama in its evolution. However, he is apparently aware that the evolution it has achieved so
far is well short of that in biology. "It is hoped that with the help of some impulse towards greater complexity, this dynamic can lead to a large
spiraling upwards in complexity." He thinks the needed impulse may be supplied by running Tierra on a worldwide network of computers analogous
to a multicelled creature.
Ray's advocacy of the proposed network sounds a bit like a venture capital prospectus, complete with disclaimers. "Eventually the product can be
neutered and sold to the end user.... it is a venture into the unknown for which we can not estimate the likelihood of success." While we should only
encourage research of this sort, we should be realistic about what it has accomplished so far. As for claims that computers will duplicate the
progress evident in biological evolution, we should maintain a healthy scepticism and wait for some convincing results before we buy in.