
ACE technique

• Alternating Conditional Expectations (ACE) was first proposed by Breiman and Friedman, 1985
• Iterative minimization procedure for computing optimal data transforms
• Non-parametric regression
• Yields maximum correlation in the transformed space
• GRACE (Graphic ACE) developed by Dr. Valko and Dr. Datta-Gupta

Transform the data to a new space:

    y \to \theta(y), \qquad x_i \to \phi_i(x_i)

In the transformed space:

    \theta(y) = \sum_{l=1}^{p} \phi_l(x_l) + \varepsilon

The optimal transformations \theta^*, \phi_i^* are obtained by minimizing

    e^2(\theta, \phi_1, \phi_2, \ldots, \phi_p) = E\left\{ \left[ \theta(Y) - \sum_{l=1}^{p} \phi_l(X_l) \right]^2 \right\}
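The alternating minimization above can be sketched in a few lines of Python. This is a minimal illustration, not the Breiman-Friedman implementation: equal-count bin averages stand in for their data smoothers when estimating the conditional expectations, and the function names (`cond_expect`, `ace`) are illustrative.

```python
import numpy as np

def cond_expect(target, feature, n_bins=20):
    """Approximate E[target | feature] with equal-count bin averages
    (a crude stand-in for the data smoother used in the original ACE)."""
    order = np.argsort(feature)
    out = np.empty(len(target))
    for idx in np.array_split(order, n_bins):
        out[idx] = target[idx].mean()
    return out

def ace(y, X, n_iter=100):
    """Alternating Conditional Expectations for theta(y) = sum_l phi_l(x_l)."""
    n, p = X.shape
    theta = (y - y.mean()) / y.std()        # start from standardized y
    phis = np.zeros((n, p))
    for _ in range(n_iter):
        for l in range(p):
            # update phi_l against the partial residual of theta
            partial = theta - phis.sum(axis=1) + phis[:, l]
            phis[:, l] = cond_expect(partial, X[:, l])
        # update theta against the sum of transforms, then standardize
        theta = cond_expect(phis.sum(axis=1), y)
        theta = (theta - theta.mean()) / theta.std()
    return theta, phis
```

On data with a smooth monotone relation (e.g. y = exp(2x) plus noise), the correlation between theta(y) and the sum of phi_l(x_l) approaches 1 in the transformed space, which is the maximum-correlation property cited above.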
ACE Technique: example
Two well logs (GR, RHOB) are used to estimate lnKg.

[Figure: four ACE cross-plots —
    GR Optimal Transform: y = 1.3273x^3 - 0.1825x^2 + 0.0066331x - 8.4493e-005
    RHOB Optimal Transform: y = -18191.5265x^3 + 21184.9705x^2 - 8213.7284x + 1060.31
    Optimal Regression (Sum Tr vs lnKg Tr), Correl: 0.74248
    lnKg Optimal Inv Transform: y = -0.13903x^3 + 2.1345x^2 - 0.53968x + 0.09905]

Prediction procedure:
1. GR -> GR_Tr, RHOB -> RHOB_Tr
2. lnKg_Tr = Sum_Tr = GR_Tr + RHOB_Tr
3. lnKg_Tr -> lnKg -> Kg
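The three-step prediction can be sketched directly from the cubic fits printed on the plots. A caveat: the RHOB coefficients as printed involve very large cancelling terms, so they may lack the precision of the original fit; treat the numbers as illustrative of the pipeline only. `predict_lnkg` is an assumed helper name.

```python
import numpy as np

# Cubic fits read off the example plots (coefficients, highest power first)
gr_tr    = np.poly1d([1.3273, -0.1825, 0.0066331, -8.4493e-005])      # GR -> GR_Tr
rhob_tr  = np.poly1d([-18191.5265, 21184.9705, -8213.7284, 1060.31])  # RHOB -> RHOB_Tr
lnkg_inv = np.poly1d([-0.13903, 2.1345, -0.53968, 0.09905])           # Sum_Tr -> lnKg

def predict_lnkg(gr, rhob):
    """Steps 1-3: transform each log, sum, then invert to lnKg."""
    sum_tr = gr_tr(gr) + rhob_tr(rhob)   # lnKg_Tr = Sum_Tr = GR_Tr + RHOB_Tr
    return lnkg_inv(sum_tr)              # Kg then follows as exp(lnKg)
```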
Stepwise algorithm
• Variable selection is needed because of over-fitting and noise in the dependent variables
• It is difficult to fit all 2^N models when there are N independent variables
• Stepwise procedure: backward elimination and forward addition
• AIC (Akaike Information Criterion) is a measure of the goodness of fit of an estimated statistical model:

    AIC = 2k + n\left[\ln\left(\frac{2\pi \cdot RSS}{n}\right) + 1\right]

where
    k is the number of variables
    n is the number of samples
    RSS is the residual sum of squares of the model
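The AIC trade-off between fit (RSS) and complexity (k) is a one-liner. A caveat: the additive constant in AIC varies between software packages, so absolute values differ across tools; only AIC differences between models on the same data matter for selection.

```python
import math

def aic(k, n, rss):
    """AIC = 2k + n[ln(2*pi*RSS/n) + 1] for a least-squares fit.
    k: number of variables, n: number of samples, rss: residual sum of squares."""
    return 2 * k + n * (math.log(2 * math.pi * rss / n) + 1)
```

With n = 100: dropping a variable that barely raises RSS (200.2 -> 202.3) lowers AIC because the 2k penalty shrinks, while dropping one that raises RSS sharply (200.2 -> 217.5) raises AIC.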


Stepwise algorithm
• Stepwise regression is the stepwise procedure combined with linear regression
• A new method combining ACE with the stepwise procedure is proposed
  New method: stepwise regression with ACE
• The stepwise procedure with AIC can be used for variable selection

Flowchart:
1. Start with a full model
2. Propose to add or delete a variable
3. Perform regression and calculate AIC (with linear regression, this is stepwise regression)
4. If a better AIC is detected, return to step 2; otherwise the current model is the optimal model
Stepwise algorithm: example
Simple example: 3 variables (k = 3), 100 samples (n = 100), scored with

    AIC = 2k + n\left[\ln\left(\frac{2\pi \cdot RSS}{n}\right) + 1\right]

                          A   B   C   RSS     AIC
Full                      X   X   X   200.2   266.85
Step 1
  initial                 X   X   X   200.2   266.85
  single-term deletion
    -A                        X   X   208.7   268.71
    -B                    X       X   202.3   265.82
    -C                    X   X       217.5   272.55
Step 2
  initial                 X       X   202.3   265.82
  single-term deletion
    -A                            X   209.4   267.02
    -C                    X           215.2   269.56
  single-term addition
    +B                    X   X   X   200.2   266.85
Optimal                   X       X   202.3   265.82

Finally, variables A and C are selected.