Improved Accuracy of EDPMS Systems with a Near-Surface Wellbore Temperature Sensor
posted on 15 May 2019 12:09 PM by DataWise
Introduction
EDPMS systems determine bottom-hole pressure (BHP) by measuring the pressure at the top of a helium (or nitrogen) column, referred to as the transducer pressure, where the column is connected via capillary tubing to a downhole sensing point. BHP is calculated by adding the weight of the helium column to the measured transducer pressure. The column weight of the helium (nitrogen) is calculated from the true vertical depth (TVD) of the sensing point and the density of the helium (nitrogen). Applying the most accurate helium density possible to each measured pressure point gives the best achievable accuracy for an EDPMS.
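As a rough illustration of this calculation, the short sketch below adds a single-density column weight to the transducer pressure. The function name, units, and numerical values are assumptions chosen for the example, not part of the IDL.

    # Illustrative BHP calculation: transducer pressure plus gas-column weight.
    # Units assumed for this sketch: kPa for pressure, kg/m3 for density, m for TVD.
    def bottom_hole_pressure(p_transducer_kpa, gas_density_kg_m3, tvd_m):
        g = 9.80665                                                  # gravitational acceleration, m/s^2
        column_weight_kpa = gas_density_kg_m3 * g * tvd_m / 1000.0   # Pa -> kPa
        return p_transducer_kpa + column_weight_kpa

    # Example: 10,000 kPa at the transducer, helium at roughly 14.4 kg/m3, sensing point at 3,000 m TVD
    print(bottom_hole_pressure(10_000.0, 14.4, 3_000.0))             # ~10,424 kPa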
Helium (Nitrogen) Density Determination
When using DataWise's Intelligent Data Logger (IDL), the density of helium is calculated for each measured pressure from a helium (nitrogen) density table. The IDL uses two input variables, the measured transducer pressure and an estimated flowing average wellbore temperature (a single-value input), to determine helium density from this table. The flowing average wellbore temperature is usually supplied by the end user and should be based on each well's flowing temperature profile. As the real average wellbore temperature deviates from the "input" average wellbore temperature, the calculated BHP measurements become less accurate. This is especially true for shut-in build-up or fall-off conditions, during which the wellbore temperature will decrease, or increase (injectors and gas wells), toward the geothermal gradient. Determining a more accurate density at the time of each pressure measurement results in a more accurate BHP.
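The sketch below shows one way such a two-variable table lookup could work, using bilinear interpolation between grid points. The table values are approximate ideal-gas helium densities included only to make the example runnable; the real IDL table and its resolution are not shown here.

    # Illustrative helium density table: rows indexed by pressure, columns by temperature.
    # Values are approximate ideal-gas densities (kg/m3) used only for this sketch.
    import bisect

    pressures_kpa  = [5_000.0, 10_000.0, 15_000.0]
    temperatures_c = [40.0, 60.0, 80.0]
    density_kg_m3  = [
        [7.7, 7.2, 6.8],       # 5,000 kPa
        [15.4, 14.4, 13.6],    # 10,000 kPa
        [23.1, 21.7, 20.4],    # 15,000 kPa
    ]

    def _fraction(axis, i, x):
        """Fractional position of x between axis[i] and axis[i + 1]."""
        return (x - axis[i]) / (axis[i + 1] - axis[i])

    def lookup_density(p_kpa, t_c):
        """Bilinear interpolation of density at (pressure, temperature)."""
        i = max(0, min(bisect.bisect_right(pressures_kpa, p_kpa) - 1, len(pressures_kpa) - 2))
        j = max(0, min(bisect.bisect_right(temperatures_c, t_c) - 1, len(temperatures_c) - 2))
        fp, ft = _fraction(pressures_kpa, i, p_kpa), _fraction(temperatures_c, j, t_c)
        low  = density_kg_m3[i][j] * (1 - ft) + density_kg_m3[i][j + 1] * ft
        high = density_kg_m3[i + 1][j] * (1 - ft) + density_kg_m3[i + 1][j + 1] * ft
        return low * (1 - fp) + high * fp

    # Example: density at 12,000 kPa and a 55 C average wellbore temperature
    print(lookup_density(12_000.0, 55.0))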
Real-time Temperature Measurement
Knowing the average wellbore temperature at the time of each transducer pressure measurement allows a more accurate helium (nitrogen) density to be determined than with the standard IDL process, and thus improves the accuracy of the calculated BHP. Ideally, knowing the actual wellbore temperature profile and integrating density and pressure from surface to the bottom-hole sensing point would yield the most accurate BHP for an EDPMS. While this could be done with Distributed Temperature Sensing (DTS) technology, the acquisition time is much longer than that of the pressure gauges and the cost is much higher.
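To make the "ideal" case concrete, the sketch below marches pressure down the column in small TVD steps, re-evaluating the helium density at each step from the local pressure and an assumed temperature profile. The ideal-gas density relation and the linear temperature profile are stand-ins for a real density table and a measured (e.g., DTS) profile.

    # Illustrative column integration: step pressure down the column, re-evaluating
    # the local helium density from the local pressure and the temperature profile.
    R_SPECIFIC_HE = 2077.1                     # specific gas constant for helium, J/(kg*K)
    G = 9.80665                                # gravitational acceleration, m/s^2

    def helium_density(p_kpa, t_c):
        """Ideal-gas helium density in kg/m3 (stand-in for a density table or EOS)."""
        return (p_kpa * 1000.0) / (R_SPECIFIC_HE * (t_c + 273.15))

    def integrate_bhp(p_surface_kpa, tvd_m, temp_profile, steps=200):
        """temp_profile(depth_m) returns temperature in deg C at that depth."""
        dz = tvd_m / steps
        p = p_surface_kpa
        for k in range(steps):
            z_mid = (k + 0.5) * dz                                           # midpoint of this depth step
            p += helium_density(p, temp_profile(z_mid)) * G * dz / 1000.0    # Pa -> kPa
        return p

    # Example: linear temperature profile from 25 C at surface to 95 C at 3,000 m TVD
    profile = lambda z: 25.0 + (95.0 - 25.0) * z / 3_000.0
    print(integrate_bhp(10_000.0, 3_000.0, profile))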
The least expensive and most practical method of determining the average wellbore temperature for each pressure measurement is to continuously measure temperature with a single sensor near the surface, inside the wellbore (avoiding surface thermal events). A single-point temperature sensor can take readings at the same frequency as the pressure gauge. The sensor should be in good contact with the production tubing to ensure that the fluid in the tubing dominates heat transfer during flowing/injection periods. The sensor should also be placed deep enough that the annulus surrounding it continuously contains liquid for heat transfer during shut-in periods.
For producing wells, the near-surface location is where the flowing temperature is furthest from the geothermal gradient and therefore changes most rapidly when the well is shut in. For injection wells, a near-surface location may not be optimal, and the specific well conditions should be evaluated.
Near-surface Temperature Process
Using a simple wellbore model, an "average wellbore temperature table" (AWTT), cross-referenced to the near-surface temperature acquisition depth, is created and loaded into the IDL. For every transducer pressure measurement, the near-surface temperature measured at the same time is used to look up the average wellbore temperature in the AWTT. This temperature then replaces the "input" average wellbore temperature and is used to determine density from the helium (nitrogen) density table. The helium column weight therefore changes with both pressure and temperature as each BHP is calculated.
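The sketch below strings the three steps together: AWTT lookup from the near-surface reading, density determination, and the column-weight addition. The AWTT values are invented for illustration, and an ideal-gas relation stands in for the helium density table; none of this reflects the actual IDL tables or internals.

    # Illustrative near-surface temperature workflow for one pressure measurement.
    # Step 1: near-surface sensor temperature -> average wellbore temperature (AWTT).
    # Step 2: (transducer pressure, average temperature) -> helium density.
    # Step 3: transducer pressure + helium column weight -> BHP.
    AWTT_SENSOR_C = [20.0, 40.0, 60.0, 80.0]        # near-surface sensor temperature, deg C
    AWTT_AVG_C    = [45.0, 55.0, 65.0, 75.0]        # corresponding average wellbore temperature

    def awtt_lookup(t_sensor_c):
        """Piecewise-linear lookup of average wellbore temperature from the AWTT."""
        if t_sensor_c <= AWTT_SENSOR_C[0]:
            return AWTT_AVG_C[0]
        if t_sensor_c >= AWTT_SENSOR_C[-1]:
            return AWTT_AVG_C[-1]
        for i in range(len(AWTT_SENSOR_C) - 1):
            lo, hi = AWTT_SENSOR_C[i], AWTT_SENSOR_C[i + 1]
            if lo <= t_sensor_c <= hi:
                f = (t_sensor_c - lo) / (hi - lo)
                return AWTT_AVG_C[i] * (1 - f) + AWTT_AVG_C[i + 1] * f

    def helium_density(p_kpa, t_c):
        """Ideal-gas helium density, kg/m3 (stand-in for the density table)."""
        return (p_kpa * 1000.0) / (2077.1 * (t_c + 273.15))

    def bhp_from_near_surface_temp(p_transducer_kpa, t_sensor_c, tvd_m):
        t_avg_c = awtt_lookup(t_sensor_c)                            # step 1: AWTT lookup
        rho = helium_density(p_transducer_kpa, t_avg_c)              # step 2: density determination
        return p_transducer_kpa + rho * 9.80665 * tvd_m / 1000.0     # step 3: add column weight (Pa -> kPa)

    # Example: 10,000 kPa transducer pressure, 52 C near-surface reading, 3,000 m TVD
    print(bhp_from_near_surface_temp(10_000.0, 52.0, 3_000.0))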