There is a recognized need to combine the skills of geoscientists and engineers to build quantitative reservoir models that incorporate all available reservoir data. These integrated reservoir models are critical for forecasting, monitoring, and optimizing reservoir performance over the life cycle of the reservoir. They enable reservoir engineers to perform flow simulation studies more accurately, identify permeability flow paths and barriers, map bypassed oil, and monitor pressure and saturation fronts in the reservoir, all of which are essential for effective reservoir management.
Available reservoir data include conceptual geological models, seismic data, core data, well log data, and historical production data. Each data type carries information about the true distribution of petrophysical and fluid properties in the reservoir, at different scales and with varying degrees of uniqueness. The challenge is to integrate all of these disparate data into a unified, self-consistent reservoir model, a challenge that can be tackled only through collaborative efforts among specialists in the different disciplines.
This paper reports on the beginning of a joint reservoir modeling effort among four research groups in the Stanford School of Earth Sciences: SCRF (Stanford Center for Reservoir Forecasting), SUPRI-B (Stanford University Petroleum Research Institute, Reservoir Simulation), SRB (Stanford Rockphysics and Borehole Geophysics Project), and SEP (Stanford Exploration Project). For many years these groups have been studying different aspects of characterizing and modeling reservoirs, with varying amounts of collaboration. For example, SRB and SEP, along with Norsk Hydro, carried out a seismic monitoring feasibility study of the Troll field, involving reservoir modeling and flow simulation, rock physics, and seismic modeling (Lumley et al., 1994). SCRF and SUPRI-B have collaborated on building optimum stochastic reservoir models for flow simulation studies. SRB and SCRF are collaborating on using rock physics and seismic data more effectively to constrain geostatistical estimations and simulations. Our approach in the new work reported here was to involve all of the groups in a more systematic way, at all levels of the modeling project.
The work presented is aimed particularly at reservoir production monitoring, involving time-lapse, or 4-D, seismic modeling. The value of repeated 3-D seismic surveys is that they allow us to subtract away much of the geologic uncertainty in the reservoir and overburden, while highlighting temporal changes in reservoir conditions related to production.
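The cancellation argument behind time-lapse differencing can be illustrated with a minimal synthetic sketch. The array sizes, the impedance values, and the magnitude of the production-induced change below are all hypothetical placeholders, not values from this study; the point is only that subtracting a repeat survey from a baseline removes the static geologic signal, which is common to both, and leaves the temporal change.

```python
import numpy as np

rng = np.random.default_rng(0)

# Static geologic background: a spatially variable acoustic-impedance map.
# (Values here are arbitrary illustrative numbers.)
geology = 8.0 + rng.normal(0.0, 0.5, size=(50, 50))

# Hypothetical production effect: a localized impedance change where
# fluid saturations have shifted. Real magnitudes depend on rock and
# fluid physics and must come from a rock-physics model.
production_change = np.zeros((50, 50))
production_change[20:30, 15:35] = 0.4

base = geology                          # baseline survey, before production
monitor = geology + production_change   # repeat survey, after production

# Differencing the two surveys cancels the (uncertain) static geology,
# leaving only the production-induced change.
difference = monitor - base
```

Here `difference` recovers `production_change` to floating-point precision, even though `geology` itself is unknown and noisy; in practice, repeatability of acquisition and processing limits how cleanly the static signal cancels.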
One of the goals was to lay the foundations for future collaborative projects among the participating groups and to foster a multi-disciplinary research and teaching environment. The definition of a common data flow and hardware platform is one essential requirement for effective collaboration among research groups involved in analyzing and modeling large quantities of data. In this project we built tools for exchanging data and sharing software. This sharing was greatly facilitated by the availability of a single computational platform, a 16-processor Power Challenge from Silicon Graphics. This powerful machine was also necessary for the computationally intensive tasks of the project, such as reservoir geological modeling, fluid flow simulation, and seismic modeling and imaging.
This article demonstrates the methodology through a realistic model study. Our ultimate goal is to apply the method to data acquired on a real reservoir.