OK. Thanks for clearing up the points I asked about. I was thinking in terms
of computer models, and you were thinking in real world terms. In the real
world there is no static fitness function. It may stagnate for a long time,
but then the environment changes.
I have no confidence whatever in the ID crowd's views on genetic
algorithms. I believe it was Bill Dembski who criticized genetic algorithms
because the fitness function was prespecified. He thought that was cheating.
I call it setting a goal -- without which you can't expect an algorithm to
converge to anything meaningful. And yes, I agree that gradient methods and
Newton's method are very susceptible to local minima. GA's avoid that
problem. Simulated annealing is said to avoid it as well, but I haven't used
SA, so I can't testify from personal experience.
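To make the local-minima point concrete, here is a minimal sketch (my own illustration, with a made-up two-well cost function and arbitrary parameter choices): gradient descent started in the wrong basin settles into the local minimum, while a simple elitist GA sampling the whole interval finds the global one.

```python
import random

def f(x):
    # Hypothetical cost with two basins: a local minimum near x = +0.96
    # and the global minimum near x = -1.03 (due to the 0.3*x tilt).
    return (x * x - 1) ** 2 + 0.3 * x

def grad_f(x):
    return 4 * x * (x * x - 1) + 0.3

def gradient_descent(x, lr=0.01, steps=500):
    # Follows the local slope only; it cannot leave the basin it starts in.
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

def genetic_minimize(pop_size=40, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-2, 2) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)                     # lower cost = fitter
        survivors = pop[: pop_size // 2]    # prune unsuccessful variants
        # refill with mutated copies of the survivors (elitist scheme)
        pop = survivors + [x + rng.gauss(0, 0.2) for x in survivors]
    return min(pop, key=f)

gd_x = gradient_descent(2.0)   # starts in the wrong basin
ga_x = genetic_minimize()
print(f"gradient descent: x={gd_x:.3f}  f={f(gd_x):.3f}")
print(f"genetic search:   x={ga_x:.3f}  f={f(ga_x):.3f}")
```

Gradient descent from x = 2 gets stuck at the local minimum near +0.96, while the population-based search settles near the global minimum around -1.03.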
On Sun, Aug 24, 2008 at 7:13 PM, Rich Blinne <rich.blinne@gmail.com> wrote:
>
> On Aug 24, 2008, at 3:39 PM, William Hamilton wrote:
>
>
> I find much that I agree with in this thread -- and some things that are
> puzzling. As Iain pointed out, when the input variables are coupled,
> changing a number of inputs (not necessarily all) simultaneously makes
> sense. If you can compute the gradient, then you can use steepest descent or
> the conjugate gradient method or Newton's method to determine how much to
> adjust each input variable.
>
>
> The reason GAs and simulated annealing are used rather than Newton's
> method is to avoid local minima, which your discussion below alludes to.
> If you change too many variables simultaneously you get too greedy. Maybe a
> clarifying note here will be helpful. When I talk about varying multiple
> variables, I mean varying multiple variables that all lower the
> cost function. (See my next paragraph.) If you dive too quickly to a local
> minimum you might get "stuck".
>
>
> I find this puzzling. I think the criterion for using a GA is the form of
> the fitness function. If it varies smoothly and (especially) if its gradient
> can be computed or approximated without undue computational effort then
> steepest descent, conjugate gradient or possibly Newton's method are
> appropriate. If it has a complex and/or nondifferentiable structure, then
> GA's are appropriate. For example, if you were trying to maximize
> (1-x^2)*sin(1/x) on [-1,1], a gradient algorithm wouldn't be appropriate
> because the function is not differentiable at 0. Yet a GA would (I haven't
> tried it, but I'm pretty sure this is so) find a point near zero. (The
> function isn't even defined at zero, but the peaks increase in value
> as the search point tends toward zero.)
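As a rough sketch of how one might actually test that claim (the population size, mutation width, and generation count are illustrative guesses, not tuned values), here is a simple elitist GA maximizing (1-x^2)*sin(1/x) on [-1,1], with a guard for the undefined point at x = 0:

```python
import math
import random

def peaks(x):
    # (1 - x^2) * sin(1/x) is undefined at x = 0, so guard that point
    # with a fitness below anything the function can actually attain.
    if x == 0.0:
        return -2.0
    return (1 - x * x) * math.sin(1.0 / x)

def ga_maximize(pop_size=60, generations=80, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(-1, 1) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=peaks, reverse=True)   # highest fitness first
        parents = pop[: pop_size // 2]
        # children are mutated copies of parents, clamped to [-1, 1]
        children = [min(1.0, max(-1.0, p + rng.gauss(0, 0.05)))
                    for p in parents]
        pop = parents + children            # parents kept (elitism)
    return max(pop, key=peaks)

best = ga_maximize()
print(f"best x = {best:.4f}, fitness = {peaks(best):.4f}")
```

Because the selection step never evaluates a derivative, the wild oscillation near zero is no obstacle; the population drifts toward the ever-taller peaks at small |x|.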
> In a multivariable optimization problem a GA will vary many, perhaps all,
> of the input variables, in possibly large steps. Chaos doesn't result
> because the unsuccessful variations are eliminated in the next generation.
>
>
> As you stated, unsuccessful states get pruned by a GA. What was proposed
> by UcD, though, was a set of large *net* changes, presumably after
> pruning, and that would be chaotic. What we are illustrating is that a
> large number of simultaneous *deliberate* changes is inferior, as a
> design strategy, to a large number of random changes subject to
> selection. Selection keeps that rate under control. When I am designing
> in a deliberate fashion rather than running a brute-force computer model,
> I had better not be turning all the knobs simultaneously; if something
> goes wrong I won't have a clue why.
>
>
> Agreed. Genetic algorithms work well in computers and in nature because
> they are brute-force parallel algorithms. Use a large enough population and
> have available a rich enough set of variations, and eventually you will get
> an improvement in fitness. And as Iain pointed out, such a population can
> track a changing fitness function. (I don't understand Rich's comment about
> how a static fitness function can lead to extinction. Shouldn't it just lead
> to a fairly stable population that wanders around the peak of the fitness
> function?)
>
> If your cost function is static then the diversity it produces is small.
> As you rightly say, it just wanders around the peak. If such an
> environment gets "shocked" it cannot respond well to the shock; all the
> selection does is select everything away, and thus mass extinction. Take
> a modern
> example. If you have a stable climate and have life that's adapted to it
> there is very little to select from when you shock it with a large
> anthropogenic climate change. This is why mass extinction rather than a
> burst of biodiversity is being predicted given AGW.
>
> Rich Blinne
> Member ASA
>
--
William E (Bill) Hamilton Jr., Ph.D.
Member, American Scientific Affiliation
Rochester, MI/Austin, TX
248 821 8156