
SPE-184371-MS

Artificial Neural Network Model for Predicting Wellbore Instability


E. E. Okpo, A. Dosunmu, and B. S. Odagme, University of Port Harcourt

Copyright 2016, Society of Petroleum Engineers


This paper was prepared for presentation at the SPE Nigeria Annual International Conference and Exhibition held in Lagos, Nigeria, 2–4 August 2016.
This paper was selected for presentation by an SPE program committee following review of information contained in an abstract submitted by the author(s). Contents
of the paper have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material does not necessarily reflect
any position of the Society of Petroleum Engineers, its officers, or members. Electronic reproduction, distribution, or storage of any part of this paper without the written
consent of the Society of Petroleum Engineers is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may
not be copied. The abstract must contain conspicuous acknowledgment of SPE copyright.

Abstract
Drilling activities have progressed to deep and ultra-deep waters in recent times, and with this progression come more challenges. Because of the difficulty of directly obtaining important parameters like in-situ stress and fracture gradient, simple models have evolved. This study is a novel attempt to make up for the gap inherent in such models, namely that they neglect chemical and thermal effects, settling for only an effective-stress and time-dependent analysis. The study applied Neural Network (NN) technology to predict geomechanical parameters. A Neural Network, as a branch of Artificial Intelligence (AI), can be trained on available parameters to replace data that cannot be immediately or easily acquired. Data from a well drilled in the Niger Delta region of Nigeria were used as the case study. A training set of input data was used to train the network, and a validation set ensured a completely independent measure of network accuracy. A Neural Network model was developed in Neuroph Studio, a Java neural network platform, and the NetBeans IDE. The model has the advantage of being easy to use, open source and cross-platform, and it is generally designed to save the cost associated with wellbore instability.

Introduction
Efficient exploitation of oil and gas hinges on adequate knowledge and application of rock
mechanics, amongst other factors such as civil unrest, technology and government austerity programs. Once
a geomechanical model has been developed that properly measures pore pressures, rock properties, and
principal stress magnitudes and orientations, predicting wellbore instability becomes more feasible.
Prior to drilling a well, the strength of the rock at a particular depth is in a state of
equilibrium with the in-situ rock stresses. Drilling a wellbore introduces tangential, axial and radial
stresses. If the redistributed stress state exceeds the rock strength, that equilibrium is disturbed and what
follows is potential wellbore instability as the supporting material around the hole becomes loose.
During wellbore integrity analysis, certain parameters are seen as particularly valuable or effective. Examples
are porosity, hole size, in-situ stress, fracture gradient, rock strength properties and permeability. Wellbore
instability can be prevented mainly by properly adjusting the mud pressure (Al-Ajmi and Zimmerman, 2006).
Regrettably, some of these parameters, such as in-situ stresses, fracture gradient and rock
strength, are usually not available directly because of how expensive and time consuming it is to obtain them
through tests. The good news, however, according to the literature, is that they depend on other
parameters and can be estimated as functions of these parameters (Mallah and Nashawi, 2005). It is
therefore imperative to model wellbore integrity with parameters or inputs that have proven to have
meaningful effects on the behavior of a well. Hubbert and Willis (1957) developed a mechanical wellbore
stability model which primarily assumed a linear elastic stress pathway around the wellbore. In recent times
many models have also been developed in an attempt to improve on the work of others and make
hydrocarbon exploitation more profitable and cost effective.
Artificial Neural Networks (ANNs) have been particularly successful in pattern recognition and phase
behavior prediction problems of petroleum engineering interest. An example is fuzzy logic systems, which have
been put to work in the area of reservoir characterization (Lim and Lim, 2004).
ANNs are information processing networks modeled after the human brain. With highly
demonstrated potential to predict wellbore integrity concerns, they have even been known to estimate
unavailable data accurately by using available drilling, geology and reservoir data. Figure 2.8 is an ANN
model presented by Demuth and Beale (1998).

Figure 2.8 Typical Artificial Neural Network (Demuth and Beale, 1998)

This research work aims at designing a solution to wellbore stability problems where some data are not
immediately available. By embracing chemical, mechanical and other wellbore effects, it is expected to
address the shortcoming of uncertainty in the input data needed to run an efficient and robust analysis of
wellbore concerns. It will also be beneficial to oil and gas industry experts and investors, since considerable cost
is expected to be saved by its application. After careful analysis using Neuroph Studio, the trained dataset
indicated high accuracy in the experiment.

Methodology
In this study, three software simulations were used. They include simulations from the Netbeans
Platform, Java Language and The Neuroph Studio. The Java Language is a platform-independent and
powerful programming language that was developed with the central idea of writing applications that can
be run anywhere or on any platform. What that means is that code written and compiled in a given
platform like Windows can run in other platforms like Mac and Linux Operating System and even Mobile
Platforms. Figure 3.1 is representative symbol of the Java language and Neuroph Studio. Remarkable
features of Java that makes it a language of choice for this study is the fact it is a freeware and mostly
open-source which implies that there is no financial obligations attached to using it. Another good thing
is that it can be used in many platforms with ease.


Figure 3.1 The Java Technology Logo and the Neuroph Studio

The NetBeans Platform includes Application Programming Interfaces (APIs) that make it easy to
handle typical windowed applications. NetBeans is designed and implemented in the Java language;
therefore, using modules, which comprise libraries of code, applications can easily be developed in
NetBeans. One remarkable thing about the NetBeans Platform that has made it useful for this project is that
it has a module system that accepts plugins such as Neuroph Studio.
Neuroph Studio is a development environment specifically meant for neural networks and designed to work
seamlessly with the NetBeans and Java platforms. With this integration, developers do not have to
implement neural network solutions and Java language input independently. As a special boost to this
research work, neural networks are good at dealing with incomplete data.

Procedure for Training the Neural Network to Predict Wellbore Instability


The flow chart for the training process is given in figure 3.4. This flowchart depicts the practical
process of training a dataset in the Neuroph Studio. The summary of the whole procedure is explained as
follows:

Figure 3.4 Flow Chart for Training The Neural Network To Predict Wellbore Instability.


Step 1. Dataset Preparation


Training is necessary to certify that a neural network is intelligent enough to be used. Neural networks are
trained using training sets. An important prerequisite before training is to normalize the dataset to make
it acceptable for processing. The values of all attributes are numeric, so the standard min-max
normalization formula is used, which takes the form

B = (A - C) / (D - C)

where B is the standardized value, A is the given value, and C and D are the minimum and maximum used
to define a valid range for the value. Another method of normalization is to compute the inverse of the values.
It is noteworthy that the main criterion for choosing a normalization or transformation method is what works
well for the process. In this case the inverse of the values proved a faster and more efficient
method and so was adopted in this research work.
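As an illustration, a minimal plain-Java sketch of the min-max scaling described above is given below; the class and method names are hypothetical, and the example range is taken from the measured-depth row of Table 2.

```java
// A minimal sketch of min-max normalization. The class and method names are
// hypothetical; C and D are the chosen minimum and maximum of the attribute.
public final class MinMaxNormalizer {

    // Scales a raw value A into the range [0, 1] using B = (A - C) / (D - C).
    public static double normalize(double a, double c, double d) {
        return (a - c) / (d - c);
    }

    public static void main(String[] args) {
        // Example: a measured depth of 5000 ft with the valid range
        // 436.06-9037.58 ft taken from Table 2.
        System.out.println(normalize(5000.0, 436.06, 9037.58));
    }
}
```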
In this research work, if an instance belongs to the first class, the wellbore is interpreted as being unstable;
if an instance belongs to the second class, the wellbore is interpreted as being stable. Each class is encoded
by setting its corresponding output to 1 and the other output to 0. This arrangement works well for this
study since the target values in the dataset are 0s and 1s. Figure 3.5 is a cross-section capture of the
normalized dataset. The coloured column is normalized, and the full data can be viewed in Appendix A.

Figure 3.5 A Cross Section of the Normalized Dataset

It is also imperative to mention here that, prior to uploading the data into the Neuroph Studio IDE, some
of the data that are not available on the original data sheet are provided for by using the principle of
numbering classes. This principle involves converting non-numeric data to acceptable numeric form by
assigning numeric values to each category. For example, Water-Based Mud was assigned the value of 1 while
Oil-Based Mud was assigned the value of 2.
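A minimal sketch of this class-numbering step in plain Java is shown below; the class name and the map contents are illustrative only and follow the assignment quoted above.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the "numbering classes" principle: non-numeric entries
// such as mud type are mapped to numeric codes. Names here are hypothetical.
public class MudTypeEncoder {
    private static final Map<String, Integer> MUD_TYPE_CODES = new HashMap<>();
    static {
        MUD_TYPE_CODES.put("Water-Based Mud", 1);
        MUD_TYPE_CODES.put("Oil-Based Mud", 2);
    }

    // Returns the numeric code for a textual mud type entry.
    public static int encode(String mudType) {
        Integer code = MUD_TYPE_CODES.get(mudType);
        if (code == null) {
            throw new IllegalArgumentException("Unknown mud type: " + mudType);
        }
        return code;
    }

    public static void main(String[] args) {
        System.out.println("Oil-Based Mud -> " + encode("Oil-Based Mud"));
    }
}
```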
Step 2. Creating a New Neuroph Project
After normalizing the data, it is saved as a comma-separated file with the .csv extension to get it set
as a new training set. A new project is then created by following the process. Figure 3.7 is a pictorial view
of this step. At the Project node of the menu bar, New Project is clicked and the procedure followed
to create a new project.


Step 3. Developing a Training Set


First of all, right-click the newly created project, and then select New, followed by Dataset. The
dataset name for this experiment is ProjDataSet. To ensure that minimal error is encountered during
training, supervised training is chosen and then processed by inputting into the neural network a set of
samples together with the expected outputs for the respective samples, which then become the sample
dataset for this research work.
A well-known form of learning in neural networks is supervised learning, where iterations or training
continue until a target output is achieved with an attendant small error rate. As supervised training proceeds,
iterations continue until the output of the neural network matches the anticipated output with a
reasonably small rate of error. In Neuroph Studio, the error rate is set at 0.01 for good results.
Next, the number of inputs is set to 10, because there are 10 input attributes from the experiment, and
the number of outputs is set to 2. Since the input data is too large to enter manually, the load-from-file option
is chosen to upload the comma-delimited file. Sometimes the IDE (Integrated Development Environment)
rejects the file; if that happens, the values need to be checked to ensure they are all in the range of 0 to 1.
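For readers who prefer the programmatic route, the same training set can be built from the comma-delimited file with the Neuroph core library; the sketch below assumes the Neuroph 2.x API and the file name used in this work.

```java
import org.neuroph.core.data.DataSet;

// Sketch of Step 3 done programmatically (assuming the Neuroph 2.x core API):
// 10 input attributes, 2 outputs, comma delimiter, loaded from the normalized file.
public class BuildTrainingSet {
    public static void main(String[] args) {
        DataSet projDataSet = DataSet.createFromFile("ProjDataSet.csv", 10, 2, ",");
        System.out.println("Loaded " + projDataSet.size() + " sample rows.");
    }
}
```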
Step 4. Developing the Neural Network
The first neural network that was trained in this experiment was named WellborePredictionNet. It was
created by right-clicking the new project in the Projects window, and then clicking New and Neural
Network. The name and the type of the network are then set, and Multi Layer Perceptron is selected. The
multilayer perceptron is a commonly used neural network classifier because of its ability to model complex
functions and adapt itself to noisy data and other environmental changes. It is also easy to use and serves
as a good starting point for newcomers to the field. Figure 3.6 shows the process of developing a new
project with a dataset.

Figure 3.6 Developing a dataset in a new project

Figure 3.7 The Neural Network for Predicting Wellbore Instability is Created.
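The same network can also be created in code. The sketch below is a programmatic counterpart of Step 4, assuming the Neuroph 2.x API, with the 10-10-2 layout arrived at later in the Results and Discussion section.

```java
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.util.TransferFunctionType;

// Sketch of Step 4 (assumed Neuroph 2.x API): a multilayer perceptron with
// 10 inputs, one hidden layer of 10 neurons and 2 outputs.
public class CreateWellborePredictionNet {
    public static void main(String[] args) {
        MultiLayerPerceptron net =
                new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 10, 10, 2);
        net.save("WellborePredictionNet.nnet");  // persist the untrained network
    }
}
```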


Step 5. Training the Neural Network


To train the dataset that was created, drag and drop it and click Train. A new window pops up, where the
learning parameters are set. A low maximum error value of 0.02 is set, with momentum and learning rate
of 0.4 and 0.3 respectively. The learning rate defines the size of the steps the algorithm will take.
With the Train button clicked, the Total Network Error graph pops up and displays the iterations and the error
levels. The network error graph and the total error are shown in Figure 3.8 and Figure 3.9 respectively.
After the network is trained, we click Test in order to see the total error and all the individual errors.

Figure 3.8 Network Error Graph

Figure 3.9 Total Error Display.
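Outside the Studio GUI, the same training step could be expressed with the Neuroph core classes; the sketch below assumes the Neuroph 2.x API and uses the learning parameters quoted above.

```java
import org.neuroph.core.data.DataSet;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.nnet.learning.MomentumBackpropagation;
import org.neuroph.util.TransferFunctionType;

// Sketch of Step 5 (assumed Neuroph 2.x API): learning rate 0.3, momentum 0.4,
// maximum error 0.02, as set in the Studio training dialog.
public class TrainWellborePredictionNet {
    public static void main(String[] args) {
        DataSet projDataSet = DataSet.createFromFile("ProjDataSet.csv", 10, 2, ",");
        MultiLayerPerceptron net =
                new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 10, 10, 2);

        MomentumBackpropagation rule = new MomentumBackpropagation();
        rule.setLearningRate(0.3);  // size of each weight-update step
        rule.setMomentum(0.4);      // damps oscillation between iterations
        rule.setMaxError(0.02);     // training stops once total error falls below this

        net.setLearningRule(rule);
        net.learn(projDataSet);     // blocks until the stop condition is met
        net.save("WellborePredictionNet.nnet");
    }
}
```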

The results indicate a total error of 0.297020. This result is quite poor and indicates a bad selection of
learning parameters.
Step 6. Retraining
Retraining is a necessary process undertaken to help reduce the total net error to a desirable value and
therefore achieve an optimum network. Training is usually continued until this level is attained. This
process involves adjusting the weights or the learning parameters.
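A brief sketch of this retraining idea, again assuming the Neuroph 2.x API and with illustrative parameter values, is given below.

```java
import org.neuroph.core.data.DataSet;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.nnet.learning.MomentumBackpropagation;

// Illustrative retraining helper (assumed Neuroph 2.x API): the weights are
// re-randomized and the network trained again with adjusted learning parameters
// until the total net error reaches the desired level.
public class Retraining {
    public static void retrain(MultiLayerPerceptron net, DataSet data,
                               double learningRate, double maxError) {
        MomentumBackpropagation rule = new MomentumBackpropagation();
        rule.setLearningRate(learningRate);  // e.g. a smaller step than the first attempt
        rule.setMaxError(maxError);          // e.g. a tighter target error
        net.setLearningRule(rule);
        net.randomizeWeights();              // discard the poorly trained weights
        net.learn(data);
    }
}
```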

Advanced Training Techniques


When training completes, the network performance is assessed. Sometimes the aim of the training may
not be met, as the correct output may not be generated, which leads to some dissatisfaction. It is therefore
desirable to apply some form of regularization.


One known form of regularization is to divide the dataset into validation and training sets and then compute
the validation error rate at intervals, as sketched after Table 1. Table 1 shows data randomly selected for
validation purposes. Training continues and finally stops when the validation error begins to trend upward.

Table 1 Randomly Selected Test Data to Validate the Guess

Expected Output 1    Expected Output 2    Obtained Output 1    Obtained Output 2
0                    1                    0.212883             0.787116
0                    1                    0.535963             0.464037
0                    1                    0.290078             0.709921
0                    1                    0.236215             0.763785
0                    1                    0.175651             0.824348
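A minimal sketch of how such a validation error could be computed with the Neuroph core classes is given below (assumed 2.x API); in a full regularization scheme this value would be evaluated between training epochs and training stopped once it starts to rise.

```java
import org.neuroph.core.NeuralNetwork;
import org.neuroph.core.data.DataSet;
import org.neuroph.core.data.DataSetRow;

// Illustrative validation check (assumed Neuroph 2.x API): mean squared error of
// the network over a held-out validation set such as the rows in Table 1.
public class ValidationError {
    public static double meanSquaredError(NeuralNetwork<?> net, DataSet validationSet) {
        double sum = 0.0;
        int count = 0;
        for (DataSetRow row : validationSet.getRows()) {
            net.setInput(row.getInput());        // feed the normalized inputs
            net.calculate();
            double[] actual = net.getOutput();
            double[] desired = row.getDesiredOutput();
            for (int i = 0; i < desired.length; i++) {
                double diff = desired[i] - actual[i];
                sum += diff * diff;
                count++;
            }
        }
        return sum / count;
    }
}
```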

Acquisition and Analysis of Data


Data used in the study were extracted from an SPDC well located in the Niger Delta. The extracted data were
derived from four sections of the wellbore. About 85 parameters formed part of these data, and they
spanned drilling parameters, survey parameters, drilling fluid parameters, geology parameters, stabilization
parameters, bit parameters, string parameters and BHA parameters. Table 2 shows some of the input data
used for the study.

Table 2 Input Parameters used in the Neural Network

S/N   Parameter                            Minimum       Maximum
1     Drill Time, hrs                      0.3           1
2     Circulation Time, hrs                0.2           1
3     Total Time, hrs                      0.2           1
4     Avg. ROP, ft/hr                      20            468
5     Surface RPM                          20            70
6     WOB, kft.lb                          0             45
7     Flow, USgal/min                      500           900
8     Pressure, psi                        350           1750
9     Mud Weight, sg                       0.929932400   0.959232614
10    Measured Depth, ft                   436.06        9037.58
11    Torque on, kft.lb                    0             12
12    Torque off, kft.lb                   1             7
13    Mud Type                             1             3
14    Formation Type                       1             3
15    Initial Gel Strength of Mud          0             9
16    Yield Point of Mud, lbf/100sq.ft     0.7           42
17    Plastic Viscosity of Mud, cP         1             58
18    Temperature, deg C                   14            110
19    Measured Depth, ft                   0             9037.58
20    True Vertical Depth, ft              0             8930.23
21    Inclination, deg                     0             12.543
22    Azimuth, deg                         150.440       48.208
23    Vertical Section (VS), ft            0             997.38
24    North, ft                            0             676.82
25    East, ft                             0             732.60
26    Dog Leg Severity (DLS), deg/100ft    0             0.09


Finally, after careful consideration, 10 input parameters were selected from this list. This input parameter list
ranges from drill time, inclination and azimuth to depth, and is presented in Appendix A.
As part of efficient use of the Artificial Neural Network, 70 percent of the data was used to train
the classifier while 30 percent was used to test the classifier.
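The 70/30 division can be done with a simple shuffle-and-split routine; the sketch below assumes the Neuroph 2.x DataSet API and uses an illustrative helper class.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.neuroph.core.data.DataSet;
import org.neuroph.core.data.DataSetRow;

// Illustrative 70/30 split (assumed Neuroph 2.x API): the rows are shuffled and
// divided into a training set and a test set.
public class SplitDataSet {
    public static DataSet[] split(DataSet full, double trainingFraction) {
        List<DataSetRow> rows = new ArrayList<>(full.getRows());
        Collections.shuffle(rows);  // randomize the row order before splitting

        int trainingCount = (int) Math.round(rows.size() * trainingFraction);
        DataSet training = new DataSet(full.getInputSize(), full.getOutputSize());
        DataSet test = new DataSet(full.getInputSize(), full.getOutputSize());
        for (int i = 0; i < rows.size(); i++) {
            (i < trainingCount ? training : test).addRow(rows.get(i));
        }
        return new DataSet[] {training, test};  // e.g. trainingFraction = 0.7
    }
}
```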

Extracting Complex and Non-Numeric Data


The strength of this study lies in the capacity of the ANN to provide for data where values for some parameters
are not available in a given data sheet. Again, there are cases where some parameters are non-numeric. For
example, formation type and mud type are non-numeric or textual in nature. However, an ANN cannot be
trained with textual data, so these had to be converted to numeric form. One way of doing this is to
assign numbers to each of these parameters. Hence oil-based mud was assigned the value of 1 while water-based
mud was given the value of 2.

Results and Discussion


This study uses a neural network as a classifier to predict wellbore instability from drilling, geology
and reservoir information of a drilled well. Effective parameters such as hole size, open hole time, drilling
fluid circulation rate, mud properties, drill string trips, bottom hole pressure, bottom hole assembly,
wellbore orientation, equivalent circulation density, in-situ stress, pore pressure, rock strength properties,
fracture gradient, formation type, porosity and permeability have been shown to be influential on wellbore
stability. These parameters were used as inputs to the developed ANN, with the instability of the wellbore
as the target, and the designed ANN model is expected to determine whether the wellbore is stable or not.
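Once trained, the classifier can be queried as sketched below (assumed Neuroph 2.x API); the ten normalized input values are placeholders, and the two outputs are interpreted with the encoding described in the training-set preparation.

```java
import org.neuroph.core.NeuralNetwork;

// Illustrative use of the trained classifier (assumed Neuroph 2.x API). The ten
// normalized inputs are placeholders; the larger of the two outputs indicates
// the predicted class (first output: unstable, second output: stable).
public class PredictWellboreStability {
    public static void main(String[] args) {
        NeuralNetwork net = NeuralNetwork.createFromFile("WellborePredictionNet.nnet");

        double[] normalizedInputs = {0.4, 0.2, 0.3, 0.5, 0.1, 0.7, 0.6, 0.2, 0.8, 0.3};
        net.setInput(normalizedInputs);
        net.calculate();

        double[] output = net.getOutput();
        boolean stable = output[1] > output[0];
        System.out.println(stable ? "Wellbore predicted stable" : "Wellbore predicted unstable");
    }
}
```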
Multilayer perceptrons are neural networks that have one or more layers between the input and output
layers and are feedforward in nature, with data flowing in one direction from the input layer to the output
layer. Their simplicity and ease of use make them useful for prediction, and they have been successfully applied
to many pattern classification problems (Demuth and Beale, 1998).
So, in this study, which was done using Neuroph Studio, a one-hidden-layer feedforward network was
created, with 10 neurons in the hidden layer found by trial and error to be the optimal number for a single-hidden-layer
network. Table 4 shows the results for the testing subset, based on the accuracy
of the classification. The samples are automatically divided into training and testing sets; the training set
is used to teach the network and the test set provides a completely independent measure of network
accuracy. Different solutions tested in this experiment have shown that the choice of the
number of hidden neurons is crucial to the effectiveness of a neural network. The experiment also showed
that the success of a neural network is very sensitive to the parameters chosen in the training process: the
learning rate must not be too high, and the maximum error must not be too low. The results have also shown
that the total mean square error does not directly reflect the success of a network. In the end, after
including only 10% of instances in the training set, we learned that even that number can be sufficient to
make a good training set and a reasonably trained neural network.
Promising results of the ANN models imply that the input parameters have been chosen properly and
that the ANN design has been done optimally.

Conclusions

This study introduces a new ANN approach to predicting wellbore instability in drilled wells. A case
study was done in one of the Niger Delta oilfields in Nigeria. ANNs have proven themselves to be
proficient classifiers and are particularly well suited for addressing non-linear problems.
Given the nonlinear nature of real-world occurrences like wellbore instability, ANNs are certainly
a good candidate for solving the problem. This model is based on readily accessible parameters
that can often be satisfactorily obtained from previous drilling, geology and reservoir data. In
addition, some parameters, like in-situ stresses, rock strength properties and fracture gradient, are
not available directly.
However, ANNs have the ability to represent complex situations through effective parameters;
hence, by considering valuable parameters related to in-situ stresses, rock strength properties and fracture
gradient in the developed ANN model, the impact of these parameters is included in the
model indirectly. In this study, not only the traditional mechanical failure mechanisms but also chemical and
thermal failure mechanisms have been analyzed and included in the proposed ANN model.
The developed ANN methodology can help drilling engineers to estimate the risk of wellbore
instability not only during well planning but also during drilling, as a real-time analysis.
A proper prediction of wellbore instability will give a viewpoint on the main causes of the
problem and consequently the best techniques for preventing wellbore instability, as well as for optimizing
the effective parameters.

Recommendations

Developing a model in Neuroph Studio to predict wellbore instability problems requires considerable
computer resources such as CPU and memory. It is therefore advised that a computer with a
minimum configuration of a dual-core processor, a 2.20 GHz CPU speed and a 64-bit operating system
should be used.
Further studies can be done using other IDEs or tools such as MATLAB, and a comparative analysis
then carried out using more diverse case studies of instability in various wells.

References
1. Zhang, J., Lang, J. and Standifird, W. 2009. Stress, Porosity, and Failure-Dependent Compressional and Shear Velocity Ratio
and Its Application to Wellbore Stability. Journal of Petroleum Science and Engineering 69(3-4): pp. 193–202.
2. Osisanya, S. O. 2012. Practical Approach to Solving Wellbore Instability Problems. 50th Anniversary of the
SPE Distinguished Lecturer Program.
3. Westergaard, H. M. 1940. Plastic State of Stress Around a Deep Well. J. Boston Soc. of Civil Engineers, 12 January
1940, pp. 156–164.
4. Gnirk, P. F. 1972. The Mechanical Behavior of Uncased Wellbores Situated in Elastic/Plastic Media under
Hydrostatic Stress. Soc. Pet. Eng. J., Feb. 1972, pp. 49–59.
5. Sinha, B. K., Norris, A. N. and Chang, S.-K. 1994. Borehole Flexural Modes in Anisotropic Formations. Geophysics
59(7): pp. 1037–1052.
6. Geertsma, J. 1978. Some Rock Mechanical Aspects of Oil and Gas Well Completions. Paper EUR 38 presented at
the 1978 SPE European Offshore Petroleum Conference and Exhibition, London, 24–26 Oct. 1978.
7. Hubbert, M. K. and Willis, D. G. 1957. Mechanics of Hydraulic Fracturing. J. Pet. Tech., Trans., AIME 210, June
1957, pp. 153–66.
8. Al-Ajmi, A. M. and Zimmerman, R. W. 2006. Stability Analysis of Deviated Boreholes Using the Mogi-Coulomb
Failure Criterion, With Applications to Some Oil and Gas Reservoirs. In the IADC/SPE Asia Pacific Drilling
Technology Conference and Exhibition, Bangkok, Thailand, 13–15 November 2006, Paper IADC/SPE 104035, pp.
1–14.
9. Sadiq, T. and Nashawi, I. S. 2000. Using Neural Networks for Prediction of Formation Fracture Gradient. In the
SPE/Petroleum Society of CIM International Conference on Horizontal Well Technology, Calgary, Alberta, Canada,
6–8 November 2000, Paper SPE 65463, pp. 1–8.
10. Demuth, H. and Beale, M. 1998. Neural Network Toolbox for Use with MATLAB. MathWorks, Inc., USA. User's Guide,
Fifth Printing, Version 3.
11. Jahanbakhshi, R., Keshavarzi, R. and Azinfar, M. J. 2011. Intelligent Prediction of Uniaxial Compressive Strength for
Sandstone. In 45th U.S. Rock Mechanics/Geomechanics Symposium, 26–29 June 2011, San Francisco, California,
Paper ARMA 11-189.


12. Keshavarzi, R., Jahanbakhshi, R. and Rashidi, M. 2011. Predicting Formation Fracture Gradient in Oil and Gas
Wells: A Neural Network Approach. In 45th U.S. Rock Mechanics/Geomechanics Symposium, 26–29 June 2011,
San Francisco, California, Paper ARMA 11-114.
13. Chu, S. R., Shoureshi, R. and Tenorio, M. 1990. Neural Networks for System Identification. IEEE Control
Systems Magazine, Vol. 10, No. 3, pp. 31–35.
14. Bishop, C. 1995. Neural Networks for Pattern Recognition. Oxford University Press, NY.
15. Chenevert, M. E. 1970. Shale Control with Balanced-Activity Oil-Continuous Muds. J. Pet. Tech., Trans., AIME
249, Oct. 1970, pp. 1309–1316.
16. Bradley, W. B. 1978. Borehole Failures Near Salt Domes. Paper SPE 7503 presented at the 1978 SPE Annual
Technical Conference and Exhibition, Houston, Oct. 1978.
