You see, it's like this: I made a post peripheral to one of Glenn's, and ultimately I realized it was pretty irrelevant to the topic being discussed. If Glenn had just said, "You're irrelevant," I would have understood and backed off. But instead Glenn responded in great detail. So now I feel a need to take one more shot. If this stirs up more bickering, I promise to make any derivative communication offline, as I suspect the matter will not be of much interest to most list participants.
Glenn: >>First, we need to clarify what type of inversion you are speaking of. If it
is the old 1980s style deterministic inversion, in which low frequency
control is added to the high frequency component from the seismic, then we
are speaking of separate types of processes.<<
Me: You're right, I'm thinking of the "old 1980s style deterministic inversion." Sorry about not paying closer attention to what you were talking about. However, this deterministic inversion is not so obsolete as your comments would suggest. A colleague down the hall, in fact, was still enhancing a sophisticated version of this in 1999, and it was getting a lot of use by operations. To my knowledge, Chevron--the company where this sin occurred--has never had a reputation in the industry for stagnating in some technological backwater. If there were better ways, I trust the operations people would have known about and tried them.
Furthermore, you'd have to argue long and hard, I'm afraid, to convince me that an inversion technique that relies totally on random methods is going to do better as a rule than a deterministic method when there's a straightforward way to do the deterministic calculations and the data satisfy the assumptions of the deterministic method. I speak mostly from ignorance of simulated annealing, but the only cases where I think simulated annealing is likely to be better are where rock structures are such that the assumptions of the deterministic method become invalid. I can imagine this would be the case where...
>>within about
100 m from any well, all bets are off in most reservoirs. There are faults,
there are sedimentological differences, saturation differences etc all of
which affect the velocity and density.<< and...
>> We have volcanic
tuffs....<< and...
>>we had injectite sands which have
dips of up to 80 degrees and widths varying from inches to 300 feet. It was
difficult image some of these injectites. In this situation, one trace is
NOT identical to the nearest neighbor.<<
That is, if subsurface structures are such that you don't have well-defined reflectors but just a jumbled mass of scatterers, the deterministic method doesn't stand a chance. I suspect this is the kind of environment where the simulated annealing method might do better than any other method.
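For list members who haven't run into simulated annealing, the core idea is simple to sketch. The toy below is my own illustration, not anybody's production inversion code: it perturbs a small one-dimensional "impedance" profile one sample at a time, accepts moves by the Metropolis rule, and cools the temperature as it goes. The misfit function, parameter values, and target profile are all made up for the example.

```python
import math
import random

def simulated_annealing(misfit, model, steps=20000, t0=1.0,
                        cooling=0.999, step_size=0.1, seed=0):
    """Minimize misfit(model) by random perturbation with Metropolis acceptance."""
    rng = random.Random(seed)
    current = list(model)
    best, e_best = list(current), misfit(current)
    e_cur = e_best
    t = t0
    for _ in range(steps):
        # Perturb one randomly chosen model parameter.
        trial = list(current)
        i = rng.randrange(len(trial))
        trial[i] += rng.gauss(0.0, step_size)
        e_trial = misfit(trial)
        # Always accept downhill moves; accept uphill moves
        # with probability exp(-dE/T), which shrinks as T cools.
        if e_trial < e_cur or rng.random() < math.exp(-(e_trial - e_cur) / t):
            current, e_cur = trial, e_trial
            if e_cur < e_best:
                best, e_best = list(current), e_cur
        t *= cooling  # geometric cooling schedule
    return best, e_best

# Toy problem: recover a blocky "impedance" profile from observed values.
target = [1.0, 1.0, 2.5, 2.5, 1.5]

def misfit(m):
    return sum((a - b) ** 2 for a, b in zip(m, target))

best, e = simulated_annealing(misfit, [1.5] * 5)
```

The point of the uphill-acceptance step is exactly what makes the method attractive for jumbled-scatterer environments: it can climb out of local misfit minima that a purely deterministic descent would get stuck in.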
However, I question Glenn's characterization of "most reservoirs" as varying so radically that a seismic trace is not similar to its neighbors. That is, variations in reservoirs are often rapid and large, but in many or most cases seismic imaging smooths out all but fairly gross features, partly as a result of the severely band-limited nature of the data. In large parts of the Western Canadian basin, for example, a stacked seismic trace is practically identical--except for a time shift--to any trace recorded up to and often beyond a kilometer away. The features of exploration interest there are usually not far above the visual detectability limit, if indeed they're detectable at all. Data from many other areas are not that uniform but still are often not as different from trace to trace as the stuff Glenn is referring to. (--Well, it's a matter of degree, and without pictures I don't really know what kinds of variations Glenn is talking about.)
I acknowledge that some areas really do have very difficult or jumbled rock structures. Papua New Guinea is an extreme that comes to mind. Chevroids had a terrible time getting any image there at all, even though there appeared to be plenty of energy from deep scatterers. Data from the North Sea that I've seen were most of the time not nearly so challenging.
One other relevant thought is that in many cases it's extremely difficult to assess whether a given geophysical technique is doing any good. That's why the industry has so many wealthy snake-oil salesmen. Did success result from our scientific brilliance or just luck? Much of the time we can't tell. Not that simulated annealing is one of those questionable techniques, but...time will tell (maybe).
>>We WERE using stacked traces. So there is no reduction from this direction.
When did you leave the business?<<
Ya got me there! As soon as I looked a second time at your numbers I realized you were talking about stacked data. Actually, I didn't leave the business until 1999. My problem is not that I'm far behind the times but that I specialized so narrowly that much of the business went its own way without me. I spent most of my career studying effects of velocity anisotropy in shear-wave data. In the early '80s I saw a chance to do something close to real scientific research by focusing on anisotropy, so I jumped at it. That was by far the most interesting thing going in exploration geophysics, even though any applications stemming from this research may not yet have earned a single company a single penny (except, of course, for the contractors, who made big bucks on our experiments). It was great fun anyway.
>>Would you accept 10^23,000,000,000? If so, I will be able to sleep tonight.<<
Have it your way.
Don
Received on Wed Dec 3 19:36:44 2003
This archive was generated by hypermail 2.1.8 : Wed Dec 03 2003 - 19:36:45 EST