On Aug 30, 2009, at 6:58 AM, Bill Powers wrote:
> What many of us at Los Alamos National Labs, where high-speed
> computing has been near the top in the world since the 50s, have
> found is that vector processors are equally important in high-speed
> computing. Vector processors are not currently in vogue (unless
> things have changed since I retired). The problem with large numbers
> of processors is keeping them busy. Ideally, you have large numbers
> of independent operations. This is often possible in 3D, but not in
> lower dimensions. Climate modeling is intrinsically 3D. Many
> physical processes intrinsically require communication between cells
> (transport processes).
>
> The problem is that the national labs in the 90s bought into the
> cheap parallel processors used by the gaming industry, where massive
> numbers of independent operations are the name of the game. It is
> the gaming industry that economically and in practice drives the
> computing industry. Cray was the last company, perhaps in the world,
> to be dedicated to scientific parallel processing.
>
> Those that are more current can perhaps address this issue better
> than I.
>
> bill
>
Climate modeling is quite amenable to massively parallel processing.
In fact, it's one of the few problems that is. That's why multi-core
computing is still looking for software to take advantage of it
outside of science and engineering. SIMD vector machines such as
Cray's were overwhelmed in capability by the MIMD machines that
followed as a result of cheap cores. Thus, Cray got bought out by
SGI, which took advantage of the massively parallel machines. The
other major development that helped the scientific community out a
lot was the competitiveness of AMD versus Intel ten years ago. That
pushed Intel to produce 64-bit processors even though the consumer
and business markets don't need them, while the scientific and
engineering markets desperately do.

If you look carefully at the article you will note two things
mentioned in passing: the memory size and the speed of memory access.
You need lots of memory, and that means 64-bit; 64-bit also means
that the round-off errors you are worried about are not an issue. The
speed of memory access, in turn, is very important for high-speed,
high-bandwidth interprocessor communication. Adjacent grid cells need
to talk to each other so that the boundary conditions of the
Navier-Stokes equations can be adequately satisfied. [Note: the
physics of climate modeling is surprisingly simple:
http://physicsworld.com/cws/article/print/26946/1/PWmod3_02-07
I wish modeling nanoscale semiconductor physics were that simple!]
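To make the "adjacent grid cells need to talk to each other" point
concrete, here is a minimal sketch (my own illustration, not from the
article) of the ghost-cell, or halo, exchange that a domain-decomposed
code does before every update step. The 1-D grid, the two-subdomain
split, the simple explicit diffusion update, and all the parameter
values are assumptions made for the example; a real climate model does
the same exchange with MPI across thousands of processors, which is
where the interprocessor bandwidth gets spent.

import numpy as np

# Toy 1-D diffusion grid split between two "processors" (subdomains).
# Each subdomain carries one ghost cell at each end; before every step
# the edge values are copied into the neighbor's ghost cells -- the
# neighbor-to-neighbor "talking" described above.  All sizes and
# coefficients here are made-up illustration values.

nx = 8                       # interior cells per subdomain (assumed)
alpha = 0.1                  # diffusion number dt*k/dx^2, kept < 0.5 for stability

# index 0 and -1 are ghost cells; 1..nx are interior cells (float64 throughout)
left = np.zeros(nx + 2)
right = np.zeros(nx + 2)
left[1:nx + 1] = 1.0         # initial condition: warm left subdomain, cold right

def exchange_halos(a, b):
    # On a real machine these two copies are MPI sends/receives, and their
    # latency and bandwidth are exactly the interprocessor cost discussed above.
    a[-1] = b[1]             # a's right ghost <- b's leftmost interior cell
    b[0] = a[-2]             # b's left ghost  <- a's rightmost interior cell

def diffuse(a):
    # One explicit diffusion step on the interior cells only.
    a[1:-1] += alpha * (a[2:] - 2.0 * a[1:-1] + a[:-2])

for step in range(200):
    exchange_halos(left, right)   # boundary data must arrive before the update
    diffuse(left)
    diffuse(right)

print(left[1:-1])                 # heat has leaked smoothly across the seam
print(right[1:-1])

The point of the exchange step is that neither subdomain can advance
until its neighbor's edge values have arrived, which is why memory and
communication bandwidth matter as much as raw flops.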
There is one major reason why weather forecasts are inaccurate:
compute power. In order to get an accurate forecast you need to
decrease your grid spacing, but if you do that you get your accurate
"forecast" three days after you need it. The same goes when you are
trying to forecast climate far into the future: you need smaller grid
spacing. Current grid sizes are adequate for computing average global
temperatures, but the policy makers want to know local effects, like
whether Lake Mead will go dry in 2018. Current simulations show
there's a 50/50 chance, but we need more accuracy and that means more
iron. This is not a science problem; it's an engineering one.
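As a back-of-the-envelope for why "more iron" is the answer: if you
halve the grid spacing in all three dimensions you get roughly 8 times
as many cells, and an explicit, CFL-limited time step has to shrink
roughly in proportion to the spacing, so the total work grows by
roughly 16x per halving. The sketch below just runs that arithmetic;
the baseline runtime and the assumption that all three dimensions and
the time step are refined together are illustrative, not taken from
any particular climate model.

# Back-of-the-envelope cost of refining a 3-D grid.  The baseline runtime
# and grid spacings are hypothetical, purely for illustration.

def relative_cost(refinement):
    # Assumes spacing shrinks by `refinement` in all three dimensions
    # (cells scale as refinement**3) and that a CFL-limited time step
    # shrinks in proportion (steps scale as refinement).
    return refinement ** 3 * refinement        # = refinement**4

baseline_days = 1.0                            # pretend the coarse run takes one day
for spacing_km, refinement in [(100, 1), (50, 2), (25, 4), (10, 10)]:
    cost = relative_cost(refinement)
    print(f"{spacing_km:>4} km spacing: ~{cost:>6}x the work, "
          f"~{baseline_days * cost:,.0f} days on the same machine")

Under those assumptions a tenfold finer grid on the same machine means
roughly ten thousand times the work, which is why the finer "forecast"
arrives long after it is useful and why the bottleneck is hardware,
not science.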
Rich Blinne
Member ASA