Re: [asa] NASA - Climate Simulation Computer Becomes More Powerful

From: Bill Powers <wjp@swcp.com>
Date: Sun Aug 30 2009 - 14:44:18 EDT

I agree and remember much that you recount.
But to say that with 64-bit machines roundoff is not an issue is clearly
false. At LANL we have had 64-bit machines since the 60s, although some
still remember the shift from 32-bit. Those who remember that shift also
remember the lessons learned from roundoff in nonlinear systems.
For a long time now we have been clamoring for 128-bit machines, because
studies of numerical noise, when they are done, indicate that roundoff
is already biting us in the rear end. With the advent of 3D codes and
their eventual inclusion of the full spectrum of physics packages, cycle
counts increase wildly, making numerical noise the hidden demon that
everyone is afraid to examine.
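
A toy illustration of what I mean (not a lab code, just a chaotic
logistic map iterated in 32-bit and in 64-bit arithmetic from the same
starting point, assuming nothing beyond Python and numpy). The two
trajectories separate purely from roundoff as the cycle count grows:

    import numpy as np

    # Chaotic nonlinear recurrence x <- r*x*(1-x), run in single and in
    # double precision. Any difference between the two is pure roundoff.
    r = 3.9
    x32 = np.float32(0.5)
    x64 = np.float64(0.5)
    for cycle in range(1, 101):
        x32 = np.float32(r) * x32 * (np.float32(1.0) - x32)
        x64 = r * x64 * (1.0 - x64)
        if cycle % 20 == 0:
            diff = abs(float(x64) - float(x32))
            print(f"cycle {cycle:3d}: |x64 - x32| = {diff:.2e}")

By a few dozen cycles the two answers share no digits at all. 64-bit
arithmetic pushes the same effect out to many more cycles; it does not
remove it.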

Some machines could implement 128-bit arithmetic in software. Although
this slowed codes down to a crawl, some people had the guts to see what
would result. With some machines we also had the capability to run
numerical noise experiments by controlling the number of bits used in
real (floating-point) computations. From such runs one can extrapolate
how the answer depends on precision.
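
That kind of experiment is easy to sketch today. Below is a crude
stand-in for the bit-control capability, assuming nothing beyond numpy
and a made-up 1D relaxation problem: every update is rounded to a chosen
number of mantissa bits, and the converged answer is compared against a
full-precision run.

    import numpy as np

    def chop(x, bits):
        # Crude software precision control: round x to 'bits' mantissa bits.
        m, e = np.frexp(x)
        return np.ldexp(np.round(m * 2.0**bits) / 2.0**bits, e)

    def relax(bits, cycles=5000):
        # Explicit 1D diffusion relaxation with every update chopped.
        u = np.linspace(0.0, 1.0, 64)
        for _ in range(cycles):
            u[1:-1] = chop(u[1:-1] + 0.4 * (u[2:] - 2.0 * u[1:-1] + u[:-2]),
                           bits)
        return u.sum()

    ref = relax(52)    # effectively full double precision
    for bits in (16, 24, 32, 40):
        print(f"{bits:2d} bits: |answer - ref| = {abs(relax(bits) - ref):.3e}")

The error falls off roughly as 2^(-bits), and it is that trend one
extrapolates from when asking what 128-bit hardware would buy.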

Still, I know of very few codes, certainly at the national labs, whose
teams have seriously undertaken these kinds of studies.

bill

On Sun, 30 Aug 2009, Rich Blinne wrote:

>
> On Aug 30, 2009, at 6:58 AM, Bill Powers wrote:
>
>> What many of us at Los Alamos National Laboratory, where high-speed
>> computing has been near the top in the world since the 50s, would say is
>> that vector processing is equally important in high-speed computing.
>> Vector processors are not currently in vogue (unless things have changed
>> since I retired). The problem with large numbers of processors is
>> keeping them busy. Ideally, you have large numbers of independent
>> operations. This is often possible in 3D, but not in lower dimensions.
>> Climate modeling is intrinsically 3D. Many physical processes
>> inherently require communication between cells (transport processes).
>>
>> The problem is that in the 90s the national labs bought into the cheap
>> parallel processors used by the gaming industry, where massive numbers of
>> independent operations are the name of the game. It is the gaming industry
>> that, economically and in practice, drives the computing industry.
>> Cray was the last company, perhaps in the world, to be dedicated to
>> scientific parallel processing.
>>
>> Those who are more current can perhaps address this issue better than I.
>>
>> bill
>>
>
> Climate modeling is quite amenable to massively parallel processing. In fact,
> it's one of the few problems that is. That's why multi-core computing is
> still looking for software to take advantage of it outside of science and
> engineering computing. SIMD vector machines such as Cray's were overwhelmed
> in capability by the MIMD machines that followed as a result of cheap cores.
> Thus, Cray was bought out by SGI, which took advantage of the massively
> parallel machines. The other major development that helped the scientific
> community out a lot was the competitiveness of AMD versus Intel ten years
> ago. This pushed Intel to produce 64-bit processors even though the consumer
> and business markets didn't need them, while the scientific and engineering
> markets desperately did. If you look carefully at the article you will
> note two things mentioned in passing: the memory size and the speed of memory
> access. You need lots of memory, and that means 64-bit; 64-bit also means
> that any round-off errors you are worried about are not an issue. This
> is very important for high-speed/high-bandwidth interprocessor communication.
> The adjacent grids need to talk to each other so that the boundary
> conditions of the Navier-Stokes equations can be adequately satisfied. [Note:
> the physics of climate modeling is surprisingly simple:
> http://physicsworld.com/cws/article/print/26946/1/PWmod3_02-07 I wish
> modeling nanoscale semiconductor physics were that simple!]
>
> There is one major reason why weather forecasts are inaccurate: compute
> power. In order to get an accurate forecast you need to decrease your grid
> size. If you do that, you get your accurate "forecast" three days after you
> need it. The same goes when you are trying to forecast climate far into the
> future: you need smaller grid sizes. Current grid sizes are adequate for
> computing average global temperatures, but the policy makers want to know
> local effects, such as whether Lake Mead will go dry in 2018. Current
> simulations show there's a 50/50 chance, but we need more accuracy, and
> that means more iron. This is not a science problem. It's an engineering
> one.
>
> Rich Blinne
> Member ASA
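
To make the point above about adjacent grids concrete: each processor's
subdomain carries a layer of ghost (halo) cells that is refilled from
its neighbors every cycle before the local update. Here is a toy sketch
of the pattern, assuming only numpy and a made-up 1D diffusion problem
split into two subdomains; a real climate code does the same exchange
with MPI, in 3D, every timestep.

    import numpy as np

    # Two subdomains of a 1D diffusion problem, each with one ghost cell
    # per side. Every cycle they exchange edge values before updating,
    # the same pattern an MPI halo exchange implements between processors.
    n, dt = 32, 0.25
    left = np.zeros(n + 2)                 # interior cells 1..n, ghosts at ends
    right = np.zeros(n + 2)
    left[1:-1] = np.linspace(1.0, 0.5, n)  # an initial temperature profile
    right[1:-1] = np.linspace(0.5, 0.0, n)

    for cycle in range(200):
        left[-1] = right[1]                # halo exchange: copy the neighbor's
        right[0] = left[-2]                # edge cell into the local ghost cell
        left[1:-1] += dt * (left[2:] - 2.0 * left[1:-1] + left[:-2])
        right[1:-1] += dt * (right[2:] - 2.0 * right[1:-1] + right[:-2])

    # Reference: the same 200 cycles on one undivided grid.
    u = np.zeros(2 * n + 2)
    u[1:-1] = np.concatenate([np.linspace(1.0, 0.5, n),
                              np.linspace(0.5, 0.0, n)])
    for cycle in range(200):
        u[1:-1] += dt * (u[2:] - 2.0 * u[1:-1] + u[:-2])

    split = np.concatenate([left[1:-1], right[1:-1]])
    print("max difference vs. undivided grid:", np.abs(split - u[1:-1]).max())

The interior updates are perfectly parallel; what limits scaling is the
neighbor-to-neighbor traffic at the start of every cycle, which is why
memory size and memory speed get mentioned in the article.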

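And on grid size versus compute power: for an explicit 3D code, halving
the grid spacing multiplies the cell count by eight, and the CFL limit
roughly halves the usable timestep, so each level of refinement costs
on the order of sixteen times the work. A back-of-the-envelope sketch,
assuming explicit time stepping and unchanged physics packages:

    # Rough cost of refining a 3D grid: 2**3 = 8x the cells per halving
    # of the spacing, and roughly 2x the timesteps (CFL condition).
    for level in range(4):
        cells = 8 ** level
        steps = 2 ** level
        print(f"spacing / {2 ** level:2d}: about {cells * steps:5d}x the work")

That factor is why halving the grid spacing a couple of times turns a
tractable run into the "forecast" that arrives three days after you
need it.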