SCRF 2002-2003

“The annual state of SCRF address”

It has been my pleasure, for the last few years, to write about the state of SCRF in the introduction to the annual report. I can report that the state of SCRF is healthy! Consider the following facts:

  • Our student body has grown to a record of eighteen, without trading quality for quantity. Excellent applicants have been accepted for 2003-2004; a post-doc, Guillaume Caumon (gOcad group), will join us in August.
  • Several papers from last year’s report have been or will be published in leading journals such as SPE Journal, JPSE, Mathematical Geology and The Leading Edge; this quick turnaround is no less than a ringing endorsement of our work by the scientific and professional community.
  • 50% of all papers in the “Geostatistics” session of the SPE ATCE 2002 in San Antonio were printed in the last SCRF report, and 80% of all papers were from SCRF authors or alumni. Rami Younis (MS, 2002) won the SPE International Student Paper Contest in the MS division.
  • Various companies are undertaking efforts to test our latest codes on actual reservoir cases.
  • Despite cutbacks, downsizing and mergers, our funding status remains strong thanks to your loyalty.

Yet, these are not reasons to become less vigilant, and certainly not less productive in terms of new ideas, practical applications or software. We hope this report demonstrates as much.

In the first paper of this year’s report, Andre Journel provides the first state-of-the-art review of multiple-point geostatistics, and also of geostatistics in general. You will enjoy reading his trademark delivery of “deliberately subjective”, unapologetic and thought-provoking statements.

Possibly the most innovative paper in this report is that by Burc Arpat. Burc proposes a new sequential simulation method for generating prior geological structures depicted in a training image (the original idea of Mohan Srivastava in 1992, see SCRF Report 5), constrained to well and seismic data. His method contains two novelties: one related to sequential simulation in general, the other to mp-geostatistics.

First, he proposes a new multi-scale approach to sequential simulation that improves on the multi-grid method. The original multi-grid approach of Tom Tran in 1994 (see SCRF Report 7) has proven to work well for SGSIM (variogram-based) simulations. Recall that in a multi-grid simulation, properties are first simulated on a coarse grid, then frozen and used as conditioning data on subsequent finer grids. The intention is to better reproduce long-range correlation. Burc recognizes that freezing data on the coarse grid may lead to inconsistencies when attempting to simulate more complex geological structures, particularly when interaction between geological scales is present, as in highly connected, curvilinear channel systems: a “mistake” made on the coarse grid can no longer be corrected or updated by any finer-grid simulation. Burc proposes to resolve this issue in two ways. First, at each grid node he does not simulate a value but retains the actual cpdf (conditional probability distribution); hence, instead of passing hard data from the coarse to the fine grid, he passes probabilities (cpdfs). Second, when simulating on the finer grid, he allows the coarse-grid probabilities to be updated, in order to correct possible inconsistencies and to re-introduce the multi-scale interaction lost in the multi-grid approach.
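
To fix ideas, here is a minimal sketch of this first novelty, written for this introduction only: a multi-grid sweep in which each node stores and later updates a cpdf rather than a frozen simulated value, with a hard value drawn only at the end. All names (estimate_cpdf, simulate_level) and the crude averaging estimator are our own illustrative assumptions, not Burc’s actual algorithm, which is detailed in his paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_cpdf(prob_grid, i, j, step):
    """Hypothetical local cpdf estimator: average the probabilities of already informed
    nodes within one grid spacing (a stand-in for a kriging- or mp-based estimate)."""
    w = prob_grid[max(i-step, 0):i+step+1, max(j-step, 0):j+step+1]
    informed = w[~np.isnan(w)]
    return informed.mean() if informed.size else 0.5     # 0.5 = uninformative prior

def simulate_level(prob_grid, step):
    """Visit every node of the current grid level and store a cpdf, not a hard value;
    cpdfs stored at coarser levels are revisited and updated, not frozen."""
    n = prob_grid.shape[0]
    for i in range(0, n, step):
        for j in range(0, n, step):
            local = estimate_cpdf(prob_grid, i, j, step)
            if np.isnan(prob_grid[i, j]):
                prob_grid[i, j] = local                             # first visit
            else:
                prob_grid[i, j] = 0.5 * (prob_grid[i, j] + local)   # update coarse cpdf
    return prob_grid

n = 16
cpdf = np.full((n, n), np.nan)            # NaN marks nodes not yet informed
for step in (4, 2, 1):                    # coarse -> fine multi-grid sweep
    cpdf = simulate_level(cpdf, step)

realization = (rng.random((n, n)) < cpdf).astype(int)   # facies drawn only at the very end
```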

Second, he proposes a way of modeling/filtering two-point, or any multiple-point, statistics derived from the training image. The concept of actually modeling multiple-point statistics is a break from the original idea of Sebastien Strebelle (see SCRF Report 14), whose landmark contribution resulted in the first truly practical mp-algorithm, termed “snesim”. The “snesim” program uses a search tree to store multiple-point statistics, scanned from the training image, in the form of conditional frequencies. The modeling of mp-statistics in Burc’s approach relies on two concepts. First, using a template, patterns (mp-statistics) are scanned from the training image and clustered into many (thousands of) groups. The average of each group is termed a “prototype” and represents typical “shapes” or “patterns” present in the training image. In the case of a binary training image (e.g. channel vs. mud), such prototypes consist of facies probabilities. Second, any comparison between actual data events occurring in the reservoir and a “prototype” is based on the definition of a “similarity” function or distance. Enabling the comparison of multiple-point data events or patterns makes it possible to filter out insignificant patterns from the training image and to enhance the reproduction of crisp shapes. The full algorithm is presented in detail in his paper, together with some illuminating examples.
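
The prototype idea can likewise be illustrated with a small sketch: patterns are scanned with a template, clustered, and a (possibly incomplete) data event is matched to the most similar prototype through a distance restricted to its informed nodes. The choice of k-means clustering and all names below are our own assumptions for illustration; Burc’s paper defines the actual clustering and similarity function.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Toy binary "training image" (1 = channel, 0 = mud); a real one comes from geological analogs.
ti = (rng.random((100, 100)) < 0.3).astype(float)

# 1. Scan all patterns falling inside a 5x5 template.
t = 5
patterns = np.array([ti[i:i+t, j:j+t].ravel()
                     for i in range(ti.shape[0]-t+1)
                     for j in range(ti.shape[1]-t+1)])

# 2. Cluster the patterns; the mean of each cluster is a "prototype" of facies probabilities.
prototypes = KMeans(n_clusters=50, n_init=10, random_state=0).fit(patterns).cluster_centers_

# 3. Match a (possibly incomplete) data event to the most similar prototype, using a
#    Euclidean distance restricted to the informed template nodes as the similarity measure.
def closest_prototype(data_event):
    informed = ~np.isnan(data_event)
    d = np.linalg.norm(prototypes[:, informed] - data_event[informed], axis=1)
    return prototypes[np.argmin(d)]            # a vector of facies probabilities

event = np.full(t * t, np.nan)
event[[0, 6, 12, 18, 24]] = 1.0                # a few informed nodes along the diagonal
print(closest_prototype(event).reshape(t, t))
```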

The paper by Tuanfeng Zhang uses a somewhat different approach, but is motivated by the same need for a filtering/modeling of mp-statistics. Tuanfeng shows how Principal Component Analysis (PCA) can be used to draw structures from training images and how large-scale structures can be captured by the first PCs. Unlike Burc, he uses the “snesim” way to perform simulation, yet shows that the RAM requirements of snesim are considerably reduced when relying on the first PCs only, without any loss in reproduction of the desired continuity. The concept of PCs is powerful since it suggests a potentially broader application of linear filters to detect and capture the “essence” of a training image in a few, but relevant, statistics.
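
As an illustration of the compression step only (how it is wired into snesim is the subject of Tuanfeng’s paper), the following numpy sketch computes the principal components of template patterns scanned from a toy training image and keeps the first few:

```python
import numpy as np

rng = np.random.default_rng(2)
ti = (rng.random((100, 100)) < 0.3).astype(float)    # toy binary training image

# Scan patterns with a 7x7 template, one flattened pattern per row.
t = 7
X = np.array([ti[i:i+t, j:j+t].ravel()
              for i in range(ti.shape[0]-t+1)
              for j in range(ti.shape[1]-t+1)])

# PCA via SVD of the centered pattern matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 5                                  # keep only the first few PCs
scores = Xc @ Vt[:k].T                 # low-dimensional representation of every pattern
explained = (s[:k]**2).sum() / (s**2).sum()
print(f"first {k} PCs explain {explained:.0%} of the pattern variance")

# Any pattern can be approximately reconstructed from its k scores alone:
approx = scores @ Vt[:k] + X.mean(axis=0)
```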

The new approaches to mp-simulation proposed by Burc and Tuanfeng are still under development; practical code is to be expected by next year’s SCRF meeting. In the meantime, the current code “snesim10.0” is available for download from our website. The code is fully outlined in a paper by Tuanfeng Zhang. Yuhong Liu describes in detail the impact various input parameters have on the simulation results. This paper may serve as a guideline for a reservoir application of this program.

The variogram is not an accurate measure of multiple-point connectivity, yet no practical measure of such connectivity is currently available. Sunderrajan Krishnan proposes a measure for curvilinear connectivity by generalizing the variogram. This would allow comparing various images, or even reservoir models, in terms of their curvilinear connectivity rather than their (rectilinear) correlation length.
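
Sunderrajan’s generalized variogram is defined in his paper; as background, the snippet below computes the familiar multiple-point connectivity function along a lag direction, i.e. the probability that n consecutive nodes all lie in the same facies, which is the kind of statistic his measure extends to curvilinear paths. The implementation is an illustrative sketch assuming a binary image and non-negative lag components:

```python
import numpy as np

def mp_connectivity(indicator, h=(0, 1), n_max=10):
    """Probability that n consecutive nodes, spaced by lag vector h (non-negative
    components), all belong to facies 1.  For n = 2 this is the non-centered
    indicator covariance; larger n probes connectivity beyond the variogram."""
    phi = []
    for n in range(1, n_max + 1):
        prod = np.ones_like(indicator, dtype=float)
        for k in range(n):
            prod *= np.roll(indicator, shift=(-k*h[0], -k*h[1]), axis=(0, 1))
        # drop the wrapped-around border so only genuine n-point strings are counted
        vi = indicator.shape[0] - (n - 1) * h[0]
        vj = indicator.shape[1] - (n - 1) * h[1]
        phi.append(prod[:vi, :vj].mean())
    return np.array(phi)

rng = np.random.default_rng(3)
img = (rng.random((200, 200)) < 0.3).astype(int)   # toy uncorrelated binary image
print(mp_connectivity(img, h=(0, 1), n_max=5))     # decays roughly like 0.3**n here
```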

In the history of geostatistics, the second major phase of development came with the advent of stochastic simulation, which allows assessing the impact of uncertainty. Stochastic simulation generates many realizations, all honoring the same data and depicting a geological continuity represented either by a variogram or by a training image. However, it has recently been recognized that the major sources of uncertainty, those that affect the digits before the decimal point, are related to the choice of major reservoir components, such as the decision about the appropriate geological concept, the reservoir structure and the net-to-gross ratio. They are less related to the so-called “random-seed” uncertainty created by generating many realizations of a stochastic model in which these important variables are fixed. The paper of Anuar Bitanov develops a method for quantifying the uncertainty on net-to-gross, which is critical in the appraisal and early development stage, where well-log and seismic data play a crucial role. The method relies on the concept of spatial re-sampling or bootstrapping, developed by Andre Journel in SCRF Report 6.
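
The spatial bootstrap idea can be sketched generically as follows: the statistic of interest (here a net-to-gross estimate from a handful of wells) is re-sampled at the well locations over many realizations sharing the same spatial continuity, and the spread of the re-sampled statistic quantifies its uncertainty. The simulator below is a crude stand-in for illustration only, not Anuar’s or Andre’s implementation:

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(4)

def unconditional_realization(n=100, smooth=10):
    """Stand-in geostatistical simulator: a smoothed Gaussian field thresholded into a
    binary sand/non-sand image with roughly 30% sand (a real study would use a
    simulator honoring the inferred spatial continuity)."""
    z = fftconvolve(rng.standard_normal((n, n)),
                    np.ones((smooth, smooth)) / smooth**2, mode="same")
    return (z > np.quantile(z, 0.7)).astype(float)

# Well locations at which net-to-gross is "measured".
wells = [(10, 10), (10, 80), (50, 45), (85, 20), (85, 85)]

# Spatial bootstrap: re-sample the 5-well NTG estimate on many realizations that share
# the same spatial continuity, and read its uncertainty from the spread of the samples.
ntg = np.array([np.mean([r[i, j] for i, j in wells])
                for r in (unconditional_realization() for _ in range(200))])
print(f"NTG from 5 wells: {ntg.mean():.2f} +/- {ntg.std():.2f} (spatial-bootstrap spread)")
```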

Where do we get the training image? This is a recurring question we face when applying mp-geostatistics. The common solution is an unconstrained Boolean model, but such a model entails an oversimplification of real geology. In her quest to address this question, Amisha Maharaja stumbled on a unique dataset in an even more unique place: the Rhine-Meuse delta in the southern Netherlands. An extensive database of some 200,000 lithological borehole descriptions and reconstructed 2D maps of this deltaic deposit, dating from the Holocene, is publicly available. Amisha explores, in her paper, the potential for using well-studied deltas as a “mold” for generating training images that could be used to model other ancient delta deposits of hydrocarbon interest.

The next group of papers concerns the problem of history matching. Last year, a new method, now termed the probability perturbation method, was presented, allowing history to be matched while maintaining geological control. The method is motivated by the fact that the original goal of history matching is not to match history per se but to provide accurate predictions of reservoir performance. Such accuracy is not easily obtained, nor is there an objective measure of how “good” or “predictive” the model really is; one has to wait for the future to occur. Nevertheless, it seems almost common sense that models can only be predictive if one of the main driving and invariant mechanisms for flow, namely geological heterogeneity, is represented as realistically as possible. Trading geological realism, the driver for accurate prediction, for accuracy in matching past (known!) performance does not seem desirable. Directly perturbing the reservoir at the grid-block level, either manually or automatically using sensitivity coefficients, often leads to good matches, but destroys the geological realism, be it represented by a variogram or a training image.

The probability perturbation method proposes to perturb the probability distributions that are used to generate the reservoir properties. In that way it is easier to ensure that geological continuity and the seismic relationships are maintained. Various new contributions using this method are presented, including an eye-opening case study:
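
For readers new to the method, its core is a single-parameter perturbation of the conditional probabilities used in sequential simulation; the schematic form below uses our own illustrative notation and should be checked against last year’s report and the papers in this section for the exact formulation.

```latex
% Schematic single-parameter perturbation of the conditional probability used in
% sequential simulation (notation illustrative, not necessarily the exact published form):
%   i^{(0)}(\mathbf{u}) : facies indicator of the current realization at location u
%   P\{A(\mathbf{u})\}  : prior (marginal) facies probability
%   r_D \in [0,1]       : perturbation parameter, optimized on the production mismatch
P\{A(\mathbf{u}) \mid D\} \;=\; (1 - r_D)\, i^{(0)}(\mathbf{u}) \;+\; r_D\, P\{A(\mathbf{u})\}
% r_D = 0 reproduces the current realization; r_D = 1 draws an independent new
% realization; intermediate values yield a gradual, geologically consistent perturbation.
```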

  • Junrae Kim presents work along the same line of thinking as Anuar, but using dynamic data: the most critical task in history matching is to find a good geological or stochastic model, not so much to find an actual realization of that model. It is more critical to perturb important parameters such as net-to-gross than to freeze such parameters and search for a realization that matches history (perturbing the random seed). He proposes an efficient way to perform history matching by perturbing such critical parameters.
  • Todd Hoffman extends the probability perturbation method by proposing a region-wise perturbation of the reservoir. Such regions can be defined by geology or by streamlines. His method is more efficient than the original probability perturbation, which perturbs the model globally, yet he ensures that no artificial discontinuities occur at the region borders. Coupling his idea with Junrae’s perturbation of net-to-gross, a region-wise perturbation of net-to-gross can be achieved.
  • Satomi Suzuki presents a hierarchical perturbation for history matching, where both the facies distribution in space and the within-facies heterogeneity are perturbed.
  • Joe Voelker takes up the challenge of applying the probability perturbation method to history matching flow-meter data in the world’s largest onshore oil field: the Ghawar field in Saudi Arabia, a mixed carbonate/clastic reservoir. Combining his skills as a true reservoir engineer (with 17 years of experience!) and his acquired stochastic thinking, he tackles the important problem of characterizing a complex reservoir where both the facies and the fracture distribution are critical. He faces a number of serious challenges that warrant by-passing some classical thinking. First, core permeability measurements at the wells do not correlate with flow-meter data; hence deriving kH from flow-meter data for matching history (the traditional approach) would simply produce inaccurate future predictions. Second, while fractures are suspected to be an important driver of the extremely high fluid production measured from flow-meter data, models such as dual porosity cannot represent such discrete fracture behavior. A new, effective fracture model is presented. This is combined with the probability perturbation method to achieve a successful history match while preserving the important geological concepts.

Even though the probability perturbation method accounts for geological continuity information and static data such as well logs and seismic, it cannot be applied as is in a practical reservoir setting. To account for static information, fine-scale reservoir models need to be generated; however, in most practical cases, one cannot run a single flow simulation on such a fine-scale realization, let alone the multiple runs required by the iterative history-matching procedure. Inanc Tureyen proposes a solution to this problem by making upscaling a part of the history-matching procedure. Any fine-scale realization is first upscaled, then the flow simulation is run on the upscaled realization. When perturbing the fine-scale simulation model, the upscaled and fine-scale flow responses must agree; otherwise, any perturbation calculated on the coarsened model would be uninformative about perturbations of the fine scale. Using a fast approximate flow simulation model, Inanc introduces an additional optimization of the gridding parameters prior to upscaling, such that the fine- and coarse-scale models agree in terms of flow response. Finally, when a history match on the coarse scale is achieved, a satisfactory match of the fine-scale model is also achieved.
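
The nested workflow, as we read it, can be sketched schematically as follows; every function below is a hypothetical stand-in (a crude block-averaging upscaler and a two-number proxy response), meant only to show where the gridding optimization sits inside the history-matching loop, not to reproduce Inanc’s code.

```python
import numpy as np

rng = np.random.default_rng(5)

# ---- hypothetical stand-ins, not Inanc's code ---------------------------------------
def perturb_fine_model(fine, r):
    """Small random perturbation of the fine-scale property field (e.g. permeability)."""
    return np.clip(fine + r * rng.normal(0.0, 0.05, fine.shape), 0.01, 10.0)

def upscale(fine, n_coarse):
    """Crude block-averaging upscaler onto an n_coarse x n_coarse grid."""
    b = fine.shape[0] // n_coarse
    return fine.reshape(n_coarse, b, n_coarse, b).mean(axis=(1, 3))

def proxy_response(model):
    """Fast approximate 'flow' response: two summary numbers standing in for rates."""
    return np.array([model.mean(), model.std()])

def coarse_mismatch(coarse, history):
    """Stand-in for running the flow simulator on the coarse model against history."""
    return float(np.sum((proxy_response(coarse) - history) ** 2))

# ---- history-matching loop with upscaling inside ------------------------------------
history = np.array([1.0, 0.3])                 # "observed" response to be matched
fine = np.ones((64, 64))
best = np.inf
for it in range(20):
    candidate = perturb_fine_model(fine, r=0.5)
    # pick the coarse resolution whose proxy response best agrees with the fine scale,
    # so that perturbations evaluated on the coarse model remain informative
    nc = min((8, 16, 32), key=lambda n: float(np.sum(
        (proxy_response(upscale(candidate, n)) - proxy_response(candidate)) ** 2)))
    m = coarse_mismatch(upscale(candidate, nc), history)
    if m < best:                               # keep perturbations that improve the match
        best, fine = m, candidate
print(f"final coarse-scale mismatch: {best:.4f}")
```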

The advent of time-lapse seismic has opened the door to detecting fluid saturation changes directly from seismic surveys. While clear success stories have been reported in cases where fluid properties differ considerably (e.g. gas injection), other, less favorable cases are often treated qualitatively, or 4D seismic has been dismissed entirely (e.g. carbonates). Jianbing Wu shows that a more detailed statistical analysis may help improve the merely qualitative visual interpretation. Jianbing argues that, given the low resolution of actual seismic data, one should not expect a point-to-point correlation between saturation changes and time-lapse seismic. He argues instead for a correlation between the spatial patterns of seismic time differences and the corresponding spatial patterns of saturation changes. PCA comes in as a useful tool to quantitatively calculate such correlations. His approach is succinctly demonstrated on the Stanford V reservoir.
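
The contrast between point-to-point and pattern correlation can be illustrated with a small synthetic experiment: local windows are extracted from a saturation-change map and from a noisy difference map, both are projected onto the leading principal component of the patterns, and the PC scores, rather than the raw pixel values, are correlated. All maps and window sizes below are synthetic placeholders, not Jianbing’s workflow:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic stand-ins: a smooth saturation-change map and a noisy "4D seismic"
# difference map derived from it (a real study would use simulator and survey outputs).
n = 64
dsat = np.cumsum(np.cumsum(rng.standard_normal((n, n)), axis=0), axis=1)
dsat = (dsat - dsat.mean()) / dsat.std()
dseis = dsat + 1.5 * rng.standard_normal((n, n))          # low signal-to-noise seismic

def patterns(img, t=9):
    """Extract all t x t windows of the map, one flattened pattern per row."""
    return np.array([img[i:i+t, j:j+t].ravel()
                     for i in range(img.shape[0]-t+1)
                     for j in range(img.shape[1]-t+1)])

A, B = patterns(dsat), patterns(dseis)

# Principal components of the saturation-change patterns; project both maps' windows
# onto the first PC and correlate the scores (pattern-to-pattern, not point-to-point).
Ac = A - A.mean(axis=0)
_, _, Vt = np.linalg.svd(Ac, full_matrices=False)
scores_sat = Ac @ Vt[0]
scores_seis = (B - B.mean(axis=0)) @ Vt[0]

print("point-to-point correlation  :", round(np.corrcoef(dsat.ravel(), dseis.ravel())[0, 1], 2))
print("pattern (PC-score) correlation:", round(np.corrcoef(scores_sat, scores_seis)[0, 1], 2))
```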

Yuhong Liu argues in her paper in favor of such pattern correlation and, moreover, demonstrates it in an actual reservoir case involving 3D seismic, in a work carried out in joint collaboration with Andrew Harding and Bill Abriel from ChevronTexaco.

Last but not least, I would like to draw your attention to the presentation of new software built around GsTL (the Geostatistical Template Library in C++). We recognize the desire of many of you to apply the methods developed at SCRF to your reservoir cases. Nicolas Remy will present the initial design of a GUI around some of the recently written software (e.g. snesim), with 3D visualization capabilities. This software could be the seed code for the development of industrial software.

As always, we look forward to your critical analysis of our work!

Jef Caers
Stanford, April 28, 2003