On Aug 24, 2008, at 6:55 PM, William Hamilton wrote:
> OK. Thanks for clearing up the points I asked about. I was thinking
> in terms of computer models, and you were thinking in real world
> terms. In the real world there is no static fitness function. It may
> stagnate for a long time, but then the environment changes.
>
> I have no confidence whatever in the ID crowd's views on genetic
> algorithms. I believe it was Bill Dembski who criticized genetic
> algorithms because the fitness function was prespecified. He thought
> that was cheating. I call it setting a goal -- without which you
> can't expect an algorithm to converge to anything meaningful. And
> yes, I agree that gradient methods and Newton's method are very
> susceptible to local minima. GA's avoid that problem. Simulated
> annealing is said to also, but I haven't used SA, so I can't testify
> from personal experience.
>
>
I have used SA and it does exactly as you say. Both algorithms take
advantage of a clustering phenomenon also found in spin glasses. You
can take an NP-complete search problem and get a quick answer if a
"cluster" of solutions exists nearby, letting randomness carry you to
a "nearby" solution in a reasonable time. If there is no nearby
solution, you are hosed. See the following on how the spin glass
analogy was used to attack the K-satisfiability (Ksat) problem:
http://www.sciencemag.org/cgi/content/full/sci;297/5582/812
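Here is a minimal sketch of the idea (my own toy code, not anything
from the paper): simulated annealing accepts uphill moves with
probability exp(-dE/T) while the temperature T cools, so randomness
can carry the search out of local minima toward a nearby better
solution. The landscape `rugged` and all names are mine.

```python
import math
import random

def anneal(energy, start, steps=20000, t0=2.0):
    """Basic simulated annealing: accept uphill moves with
    probability exp(-dE/T), cooling T geometrically."""
    x = start
    best = x
    t = t0
    for _ in range(steps):
        cand = x + random.uniform(-0.5, 0.5)   # local random move
        d_e = energy(cand) - energy(x)
        if d_e < 0 or random.random() < math.exp(-d_e / t):
            x = cand
        if energy(x) < energy(best):
            best = x
        t *= 0.9995                            # geometric cooling schedule
    return best

# A rugged landscape: many local minima, global minimum at x = 0.
rugged = lambda x: x * x + 3.0 * math.sin(5.0 * x) ** 2

random.seed(1)
x_star = anneal(rugged, start=4.0)
print(x_star)
```

A pure downhill (gradient-style) search started at x = 4 would get
stuck in the first well it falls into; the random uphill acceptances
are what let SA cross the barriers between wells.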
> The K-satisfiability problem (Ksat) asks whether one can satisfy
> simultaneously a set of M constraints between N Boolean variables,
> where each constraint is a clause built as the logical OR involving
> K variables (or their negations). Ksat is at the core of
> combinatorial optimization theory (1) and often serves as a
> benchmark for search algorithms in artificial intelligence and
> computer science. An efficient algorithm for solving Ksat for
> K ≥ 3 would immediately lead to other algorithms for efficiently
> solving thousands of different hard combinatorial problems. The
> class of combinatorial problems sharing such a crucial feature is
> called NP-complete (2), and it is a basic conjecture of modern
> computer science that no such efficient algorithm exists. Algorithms
> that are used to solve real-world NP-complete problems display a
> huge variability of running times, ranging from linear to
> exponential. A theory for the typical-case behavior of algorithms,
> on classes of random instances chosen from a given probability
> distribution, is therefore the natural complement to the worst-case
> analysis (3-5). Whereas 1sat and 2sat problems are solved
> efficiently by polynomial time algorithms (6), K > 2 randomly
> generated Boolean formulae may become extraordinarily difficult to
> solve: It has been observed numerically (7, 8) that computationally
> hard random instances are generated when the problems are critically
> constrained [i.e., close to the satisfiable-unsatisfiable (SAT-UNSAT)
> phase boundary]. The study of critical instances represents a
> theoretical challenge toward a greater understanding of the onset of
> computational complexity and the analysis of algorithms. Moreover,
> such hard instances are a popular test bed for the performance of
> search algorithms (9).
>
> The random Ksat problem has close similarities with models of
> complex materials such as spin glasses, which are fundamental
> systems in the statistical physics of disordered systems (10). Spin
> glasses deal with binary variables ("spins") interacting with random
> exchange couplings. Each pair of interacting spins can be seen as a
> constraint, and finding the state of minimal energy in a spin glass
> amounts to minimizing the number of violated constraints. Although
> the constraints in spin glasses and Ksat differ with respect to
> their precise form, in both cases the difficulty comes from the
> possible existence of "frustration" (11), which makes it difficult
> to find the global optimal state by a purely local optimization
> procedure. Links between combinatorial optimization and statistical
> physics have been known for years (10, 12, 13). Techniques from
> statistical physics are particularly useful when the size of the
> instance is large.
>
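To make the quoted SAT-UNSAT phase boundary concrete, here is a toy
experiment (my own illustration, not the paper's algorithm): generate
random 3-SAT instances at a low and a high clause-to-variable ratio
alpha and brute-force check satisfiability. The helper names
(`random_ksat`, `satisfiable`) are mine.

```python
import itertools
import random

def random_ksat(n_vars, n_clauses, k=3, rng=random):
    """Random K-SAT instance: each clause picks k distinct variables,
    each negated with probability 1/2. Literal v means 'variable v is
    true'; -v means 'variable v is false' (variables are 1..n_vars)."""
    clauses = []
    for _ in range(n_clauses):
        vs = rng.sample(range(1, n_vars + 1), k)
        clauses.append([v if rng.random() < 0.5 else -v for v in vs])
    return clauses

def satisfiable(n_vars, clauses):
    """Brute-force check: try all 2^n assignments."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

random.seed(0)
n = 10
fracs = {}
for alpha in (2.0, 6.0):  # below / above the 3-SAT threshold (~4.27)
    m = int(alpha * n)
    sat_count = sum(satisfiable(n, random_ksat(n, m)) for _ in range(20))
    fracs[alpha] = sat_count / 20
print(fracs)
```

Well below the threshold nearly every random instance is satisfiable;
well above it nearly none are, and it is the critically constrained
instances near the boundary that are computationally hard.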
The review article noted the following:
> How clusters defy satisfiability. The clustering phenomenon is
> typical of the onset of a glassy phase. In satisfiability, the set
> of solutions is connected if there are not too many constraints per
> variable (left). It splits into disconnected clusters when the
> number of constraints per variable α crosses the clustering
> threshold α_d. The clusters shrink further when α increases, and
> finally disappear at the satisfiability threshold α_c. The full
> space, depicted here as a sphere, is an N-dimensional hypercube.
This is why I say evolution solves problems one (or a few) variables
at a time, jumping from cluster to cluster. If a problem is not
amenable to that approach, evolution doesn't try to solve it. That's
because a GA, like other heuristics for NP-complete search problems,
goes to exponential time when solutions are not clustered (like my
previous beavers-with-chain-saws example). If you limit yourself to
finite time, then what gets produced are solutions to those problems
that are easily solved by a GA, even where a brute force search might
find other, more "optimal" answers. Given this, here is what you would
expect if life really was designed through a GA:
1. Slow change of the genetic code with time. (Because nearby
solutions are preferred by GA.)
2. Large portions of the genetic code that don't change at all.
(Because of clustering.)
3. Suboptimal "solutions". (Because a GA is a heuristic for an NP-
complete search problem.)
What do you know? That's exactly what we see.
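The "nearby solutions" behavior in point 1 can be illustrated with a
toy GA (my own sketch, not a model of any real genome): point
mutations at a low rate mean each generation moves at most a few bits
from its parents, so fitness climbs gradually rather than jumping.
All names (`evolve`, the all-ones target) are mine.

```python
import random

def evolve(target, pop_size=50, mut_rate=0.01, gens=200, rng=random):
    """Toy truncation-selection GA: point mutations only, so each
    offspring is a 'nearby' variant of its parent."""
    n = len(target)
    fitness = lambda g: sum(a == b for a, b in zip(g, target))
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 5]          # keep the top 20%
        pop = [[1 - b if rng.random() < mut_rate else b
                for b in rng.choice(parents)]  # mutate a random parent
               for _ in range(pop_size)]
    return max(pop, key=fitness), fitness

random.seed(2)
target = [1] * 40
best, fit = evolve(target)
print(fit(best))
```

Starting from random strings (about half the bits correct), the
population closes in on the target over many generations of small
steps, which is the slow, incremental change described above.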
Rich Blinne
Member ASA
This archive was generated by hypermail 2.1.8 : Sun Aug 24 2008 - 21:37:59 EDT