SPE 63281

Maximizing Profitability in Reservoirs Using New Technologies For Continuous Downhole Pressure Systems

James L. Buchwalter, SPE, Gemini Solutions, Inc., and Ray E. Calvert, Gemini Solutions, and Colin S. McKay, Wood Group, and Stephen J. Thompson, Wood Group

Copyright 2000, Society of Petroleum Engineers Inc.

This paper was prepared for presentation at the 2000 SPE Annual Technical Conference and Exhibition held in Dallas, Texas, 1–4 October 2000.

This paper was selected for presentation by an SPE Program Committee following review of information contained in an abstract submitted by the author(s). Contents of the paper, as presented, have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material, as presented, does not necessarily reflect any position of the Society of Petroleum Engineers, its officers, or members. Papers presented at SPE meetings are subject to publication review by Editorial Committees of the Society of Petroleum Engineers. Electronic reproduction, distribution, or storage of any part of this paper for commercial purposes without the written consent of the Society of Petroleum Engineers is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may not be copied. The abstract must contain conspicuous acknowledgment of where and by whom the paper was presented. Write Librarian, SPE, P.O. Box 833836, Richardson, TX 75083-3836, U.S.A., fax 01-972-952-9435.

Abstract

Continuous downhole data, in conjunction with new reservoir analysis tools designed to work with these data, have the potential to revolutionize the accuracy of reservoir management. The economic value of continuous downhole pressure data and the array of available options justify the use of these systems in almost all petroleum reservoir developments. The greatest value of these gauges is that, with new data analysis tools, reservoirs can be accurately managed early in the producing life, thus optimizing both short- and long-term reservoir management strategies.

Traditionally, the value of these systems has been in completion optimization using a small subset of the downhole data. The full value of the complete data stream has been ignored because of the large volumes of data and the lack of software systems for working efficiently with these data. Consequently, the reservoir has not been fully understood.

A system of software tools has been developed to capture the full value of the data from permanent downhole gauges. This new software system automates the filtering of these data in an intelligent fashion. The resulting filtered pressure data can then be input into a variety of reservoir analysis tools; for example, reservoir simulation models can now be maintained continuously. Reservoir and production engineers can therefore always have the optimal production strategy for the reservoir based on the current data.

These tools are very easy to use, so models can be developed quickly and continually maintained with minimal effort. Typically it takes less than a week to build the initial model, and only a few hours a month to update and maintain an accurate history match.

The paper includes a brief review of the downhole technologies and the software system that makes the data accessible to reservoir simulation and other reservoir analysis tools. Applications of the filtered data in both Gulf of Mexico and North Sea reservoirs are introduced.

Description of the Data Analysis System

The data analysis system used to perform these studies consists of a data filtering system co-developed by Wood Group and Gemini Solutions Inc., used in conjunction with Gemini's commercial PC simulation and mapping software, Merlin and Apprentice. Apprentice is a digitizing, geostatistical characterization, and mapping package that can be used to generate maps, calculate volumetrics, and export grid data to various simulators. Merlin is a full-featured black-oil simulator with pre- and post-processors and a transparent interface to Apprentice. Both packages run on the PC and can effectively handle models of 100,000 cells or more on a high-end Pentium PC.

The filtering package combined with Merlin and Apprentice is called Prophet. Prophet provides operators the means to efficiently manage down hole data sets of any size, and then quickly build and maintain accurate continuous reservoir simulations in a “real time” environment for reservoir management.

The filtering system is needed because pressures may be measured at second-by-second intervals, but only a small subset of these pressures is normally necessary to correctly characterize the reservoir. On the other hand, at selected times, such as during a buildup or a well choke change, pressures are needed at a finer interval for proper reservoir analysis. In addition, most operators would embrace a system that can automatically update to include new pressure data, or even new wells that come on line. The Prophet filtering system provides tools for all of these needs and much more.

The Prophet filtering system works as follows. At the onset of production, an operator creates a project that will contain a series of user-defined filter files for all wells in the reservoir. Examples of possible filters might include one pressure point per day, or a pressure point every 30 minutes unless the pressure changes by more than 5 psi during those 30 minutes. The initial filter files are created to allow the user to start viewing the data on different scales. After the filter files are created, the remaining pressure points can be smoothed, deleted, or added in more detail within selected intervals. The "add data" feature is particularly useful after the data sets become large. For example, after ten years a project will have read tens of millions of pressure points to create the filtered files. Because the project is typically updated weekly or monthly, the time for each incremental update is small. Suppose, however, that after ten years it is realized that it would have been beneficial to add detailed pressure data within a two-day interval in year 5. Recreating the entire data stream from scratch using a new filter might be an overnight process. Prophet allows the user to selectively add data only within the interval of interest using new filter constraints. As a result, instead of reading 10 years of data to recreate a new filter file, only the selected days of interest are read and automatically inserted into the existing filter file. This is of particular usefulness in reservoir simulation, where the entire data stream is of value.
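The interval-plus-threshold rule described above can be sketched as a simple pass over the raw stream: keep one point per interval, plus any point whose pressure has moved more than a threshold since the last kept point. This is only an illustrative reconstruction under those stated assumptions; the function and parameter names are invented and do not reflect Prophet's actual implementation.

```python
# Illustrative sketch of a "30 minutes or 5 psi" filter rule.
# Names and defaults are assumptions, not Prophet's actual API.

def filter_pressures(points, interval_s=1800.0, threshold_psi=5.0):
    """points: list of (time_seconds, pressure_psi), sorted by time.

    Keep a point if at least interval_s has elapsed since the last
    kept point, or if the pressure has moved more than threshold_psi.
    """
    if not points:
        return []
    kept = [points[0]]
    for t, p in points[1:]:
        last_t, last_p = kept[-1]
        if t - last_t >= interval_s or abs(p - last_p) > threshold_psi:
            kept.append((t, p))
    return kept
```

The same routine, applied with finer parameters (say, 5 minutes and 2 psi) to only a selected time window of the raw files, gives the "add data within an interval" behavior described above.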

Each time a user opens the project, any new dates that have pressure data, whether earlier or later, are automatically imported. The update can easily be automated; for example, an operator may choose to run the project update at midnight, daily or weekly, so that the filtered data files are available for review at the start of the next business day.
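The incremental-import idea amounts to comparing the dates already in the project against the dates present in the raw data store and reading only the difference. The sketch below is a minimal illustration of that bookkeeping, assuming dates are represented as sortable strings; it is not the actual Prophet interface.

```python
# Minimal sketch of incremental import: only dates not already in the
# project (whether earlier or later than existing data) are read.

def dates_to_import(project_dates, available_dates):
    """Return the raw-data dates that still need to be read, in order."""
    known = set(project_dates)
    return sorted(d for d in available_dates if d not in known)
```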

The other components of the Prophet system include an easy-to-use PC simulation and mapping package that allows models to be quickly built and maintained by typical reservoir and production engineers. In the past, because of the high cost associated with simulation, only the largest reservoirs were simulated. The costs were high for three main reasons. First, Unix workstations and the geological and reservoir simulation software that run in the Unix environment are expensive to purchase. Second, Unix-based hardware and software typically require a significant amount of support to keep the systems up and running. Third, because the interfaces between the user and the calculation modules do not allow one to rapidly build the initial model, run the cases, and analyze the results, companies incur the significant expense of training and maintaining simulation experts.

The first two costs can be mitigated by shifting to the PC environment, since high-end PCs now have the power to run large models in a reasonable time. Both the PC itself and the geological and reservoir simulation software that run on it are far less expensive to purchase and support than the Unix workstation-based alternatives. The third reason for expensive simulation studies, the lack of first-rate interfaces that significantly lower the man-hours and experience required to complete a study, has only recently been addressed satisfactorily by systems such as the one described here. Today, production and reservoir engineers can use models built on PCs to manage their assets in real time, and the right answer for the data collected to any date is always available.

Data Filtering Examples

Let us look briefly at some sample raw data sets that were filtered with the Prophet system to see the power of the filter tool. The filtered pressure files below were built from a raw pressure stream of about 120,000 pressures collected over 8 days. The total time required to import these data for multiple filters was less than a minute. About 5 minutes of additional time was required to clean up the data stream using the tools described below. The raw data source for all examples was quartz gauges.

The first data set we will examine is from a well whose downhole quartz gauge mistakenly returned both good and bad pressures simultaneously. The filtered data correspond to a 30-minute, 5-psi filter (Fig. 1). Keep in mind that the original raw pressure file for this well contains about 120,000 entries.

After applying a box delete tool, the original filter file was reduced to the filtered file shown in Fig. 2. This data stream proved correct and was acceptable for input into other reservoir analysis systems. The data can be saved in a variety of formats, either in whole or within a user-defined range.
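A box delete of the kind just described can be thought of as dropping every point that falls inside a user-drawn time/pressure rectangle. The following is a minimal sketch of that concept, with invented names; it is not Prophet's implementation.

```python
# Illustrative "box delete": remove every point inside a user-defined
# time/pressure rectangle, keeping the rest of the stream intact.

def box_delete(points, t_min, t_max, p_min, p_max):
    """points: iterable of (time, pressure); drop points inside the box."""
    return [(t, p) for t, p in points
            if not (t_min <= t <= t_max and p_min <= p <= p_max)]
```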

In a second example, data filtering, smoothing, and insertion within an interval are illustrated. The original filtered data file was created using a 30-minute, 5-psi pressure constraint (Fig. 3).

As a first step, the erroneous pressures at the bottom of the plot were removed (Fig. 4).

Next the spike was examined in closer detail (Fig. 5).

It was unclear at first glance whether the pressure spike was the result of a partial buildup or just additional bad data. To study the spike in more detail, additional data corresponding to a 5-minute, 2-psi filter were added within the spike interval using the original quartz pressure files (Fig. 6).

The added data lie at almost identical times and represent a 26-psi spike; adding these data showed the operator that this, too, was bad data, and the spike was deleted (Fig. 7).

The last tool applied to the data was a time-weighted pressure-smoothing algorithm, which reduced the entire data stream to fewer than 60 points (Fig. 8). In summary, a 2,500-fold reduction in the number of data points was accomplished in this example without an appreciable loss of data character.
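Time-weighted smoothing matters because the filtered stream is irregularly sampled: a simple average within each output window would over-weight densely sampled moments. One common way to handle this, sketched below under stated assumptions, is to weight each sample by half the time gap to each of its neighbors. The exact algorithm used in the system described here is not published; this is only illustrative.

```python
# Hedged sketch of time-weighted smoothing for an irregularly sampled
# stream: within each output window, average pressures weighted by the
# time span each sample represents (half the gap to each neighbor).

def time_weighted_smooth(points, window_s):
    """points: sorted list of (time, pressure); returns ~one point per window."""
    out = []
    i, n = 0, len(points)
    while i < n:
        t0 = points[i][0]
        j = i
        while j < n and points[j][0] < t0 + window_s:
            j += 1
        chunk = points[i:j]
        # weight each sample by half the distance to its neighbors
        weights = []
        for k in range(len(chunk)):
            left = chunk[k][0] - chunk[k - 1][0] if k > 0 else 0.0
            right = chunk[k + 1][0] - chunk[k][0] if k + 1 < len(chunk) else 0.0
            weights.append((left + right) / 2.0 or 1.0)  # lone point gets weight 1
        total = sum(weights)
        t_avg = sum(w * t for w, (t, _) in zip(weights, chunk)) / total
        p_avg = sum(w * p for w, (_, p) in zip(weights, chunk)) / total
        out.append((t_avg, p_avg))
        i = j
    return out
```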

These tools allow large, complex downhole pressure streams (as well as other data streams) to be filtered quickly, yielding a continuous data stream with variable sample rates suitable for capturing all important events surrounding data changes.

Application - Determining Gas Reservoir Size With Limited Down Hole Pressure Data

Introduction. Historically, it can take several years to obtain a reasonable description of a reservoir, in part because downhole pressures are typically measured yearly at best. The following application shows how a limited data stream of less than 90 days was used to accurately determine the size of a gas reservoir. Rapidly determining the reservoir size allows operators to use this knowledge to significantly improve early-life development decisions, maximizing both production and economics and minimizing reliance on sketchy exploration data. In this case the operator was able to determine the reserves around the well accurately and quickly, and thereby optimize the development strategy.

In the field case presented, a downhole pressure gauge was run in the first well's completion string just above the perforation depth. The gauge is an SDG digital 16K gauge from Wood Group. The gauge is read through a computer on the platform. The local PC stores the data on a continuous basis and, through the company intranet, sends a daily file to the company offices. Instantaneous readouts of the data are also accessible from the office. The daily files are processed by the filtering software to visualize trends and, when needed, to transfer data to the simulation software.

A team effort was used to study the reservoir, with both the operator and Gemini Solutions Inc. building models of the reservoir. The operator later reconfirmed results using a commercial well test analysis package.

Methodology for Determining Reserves. The following methodology was used for developing an accurate reservoir description and development strategy.

1.  A simulation model was built based on static information available including seismic interpretations, logs, and cores.

2.  The initial simulation model was used to create a production forecast.

3.  The well was produced for 60 days and the model forecast was compared to the actual producing history.

4.  The parameters that could be changed in the model in order to match the recorded production were identified.

5.  A history match was accomplished after modifying the identified parameters and integrating the recorded production and filtered bottom hole pressure data.

6.  The corrected simulation model was used to re-estimate the original hydrocarbons in place and develop an optimal production strategy before rig demobilization.
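A quick independent cross-check on step 6's gas-in-place estimate is the classic p/z material balance: for a volumetric dry-gas reservoir, p/z declines linearly with cumulative production Gp, and extrapolating the straight line to p/z = 0 yields the original gas in place G. The sketch below illustrates that standard technique, which complements rather than reproduces the simulation workflow used in the paper; all pressures, z-factors, and volumes in it are invented.

```python
# Hedged sketch of the classic p/z material balance for a volumetric
# dry-gas reservoir:
#     p/z = (p/z)_initial * (1 - Gp / G)
# A least-squares line through (Gp, p/z) extrapolated to p/z = 0 gives
# the original gas in place G. Input data here are purely illustrative.

def gas_in_place_pz(gp, p, z):
    """gp: cumulative production; p: pressures; z: gas deviation factors.

    Fits a least-squares line through (Gp, p/z) and returns the Gp
    intercept at p/z = 0, i.e. the original gas in place G.
    """
    y = [pi / zi for pi, zi in zip(p, z)]
    n = float(len(gp))
    sx, sy = sum(gp), sum(y)
    sxx = sum(x * x for x in gp)
    sxy = sum(x * yi for x, yi in zip(gp, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return -intercept / slope  # Gp at which p/z reaches zero
```

With frequent filtered downhole pressures, such a plot can be populated within weeks of first production rather than after years of annual surveys, which is the point of the application above.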