
Presented by
K. Indira

Under the guidance of
Dr. S. Kanmani,
Professor,
Department of Information Technology,
Pondicherry Engineering College.
Empirical Study on Mining
Association Rules Using Population
Based Stochastic Search Algorithms
Contents

- Objective
- Introduction
- Why Association Rule Mining
- Existing Methods and their Limitations
- Evolutionary Algorithms in AR Mining
- GA and PSO: An Introduction
- Empirical Study
- Conclusion
- Publications
- References
Objective

To propose a methodology for mining association rules both effectively
and efficiently using population-based search methods, namely the
Genetic Algorithm and Particle Swarm Optimization.
Data Mining

Extraction of interesting information or patterns from data in large
databases is known as data mining.

Why Association Rule Mining

- Large quantities of data are being accumulated.
- There is a huge gap between the stored data and the knowledge that
  could be construed from it.
- Searching for meaningful information in large databases has become a
  very important issue.
- Association rules, clustering and classification are the methods
  applied for extracting information from databases.
- Association rule mining is the most widely applied of these methods.
Association Rule Mining

Association rule mining finds interesting associations and/or
correlation relationships among large sets of data items.
Tid | Items bought
10  | Milk, Nuts, Sugar
20  | Milk, Coffee, Sugar
30  | Milk, Sugar, Eggs
40  | Nuts, Eggs, Bread
50  | Nuts, Coffee, Sugar, Eggs, Bread

Association Rules

- Rules are of the form X => Y with minimum support and confidence.
- Support s: the probability that a transaction contains X U Y.
- Confidence c: the conditional probability that a transaction
  containing X also contains Y.

Let minsup = 50% and minconf = 50%.
Frequent patterns: Milk:3, Nuts:3, Sugar:4, Eggs:3, {Milk, Sugar}:3.
Association rules:
  Milk => Sugar (support 60%, confidence 100%)
  Sugar => Milk (support 60%, confidence 75%)

[Venn diagram: customers buying milk, customers buying sugar, and
customers buying both]
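As a quick check of these definitions, here is a minimal sketch (not
from the slides) that recomputes the support and confidence of
Milk => Sugar over the five example transactions:

```python
# Example transactions from the table above.
transactions = [
    {"Milk", "Nuts", "Sugar"},                      # Tid 10
    {"Milk", "Coffee", "Sugar"},                    # Tid 20
    {"Milk", "Sugar", "Eggs"},                      # Tid 30
    {"Nuts", "Eggs", "Bread"},                      # Tid 40
    {"Nuts", "Coffee", "Sugar", "Eggs", "Bread"},   # Tid 50
]

def support(itemset):
    # Fraction of transactions that contain every item of `itemset`.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(x, y):
    # Conditional probability that a transaction with X also contains Y.
    return support(x | y) / support(x)

print(support({"Milk"} | {"Sugar"}))      # 0.6 -> 60% support
print(confidence({"Milk"}, {"Sugar"}))    # 1.0 -> 100% confidence
```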
Limitations of Existing Systems

- Apriori, FP-Growth and Eclat are some of the popular algorithms for
  mining ARs.
- They traverse the database many times, so the I/O overhead and
  computational complexity are high.
- They cannot meet the requirements of large-scale database mining.
- Their data structures may not fit in memory and are expensive to build.
- Time is wasted (especially if the support threshold is high), as the
  only pruning that can be done is on single items.
Uniqueness of Evolutionary Algorithms

Applicable to problems where no (good) method is available:
- Discontinuities, non-linear constraints, multi-modalities.
- Discrete variable spaces.
- Implicitly defined models (if-then-else constructs).
- Noisy problems.

Most suitable for problems where multiple solutions are required:
- Multi-modal optimization problems.
- Multi-objective optimization problems.

Parallel implementation is easier, and evolutionary algorithms provide
a robust and efficient approach to exploring large search spaces.
GA and PSO: An Introduction

The Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) are
both population-based search methods: in each iteration they move from
one set of points (a population) to another, with likely improvement,
using a set of control operators.
Genetic Algorithm

A Genetic Algorithm (GA) is a procedure used to find approximate
solutions to search problems through the application of the principles
of evolutionary biology.

Particle Swarm Optimization

PSO's mechanism is inspired by the social and cooperative behavior
displayed by various species such as birds and fish, including human
beings.
Block Diagram of Research Modules

Association Rule (AR) Mining -> Population-Based Evolutionary Methods,
branching into the Genetic Algorithm (GA) and Particle Swarm
Optimization (PSO).

GA modules:
- Mining association rules using GA
- Analyzing the role of control parameters in GA for mining ARs
- Mining ARs using Self-Adaptive GA
- Elitist GA for association rule mining

PSO modules:
- Mining association rules with PSO
- Mining association rules with chaotic PSO
- Mining association rules with dynamic neighborhood selection in PSO
- Mining association rules with Self-Adaptive PSO

Both branches feed into the hybrid GA/PSO (GPSO) for AR mining.
Datasets Used

- Lenses
- Haberman
- Car Evaluation

Lenses

Attribute              | Values
Age of the patient     | 1: Young, 2: Pre-presbyopic, 3: Presbyopic
Spectacle prescription | 1: Myopic, 2: Hypermetropic
Astigmatic             | 1: No, 2: Yes
Tear production rate   | 1: Reduced, 2: Normal
Result                 | 1: Hard contact lenses, 2: Soft contact lenses, 3: No lenses
Haberman

Attribute                                  | Values
Age of the patient                         | 30-83, numeric
Patient's year of operation                | Numeric (e.g. 67)
Number of positive axillary nodes detected | 0-46, numeric
Result                                     | 1 = the patient survived 5 years or longer; 2 = the patient died within 5 years
Car Evaluation

Attribute         | Values
Buying price      | Very high, High, Medium, Low
Maintenance price | Very high, High, Medium, Low
Doors             | 2, 3, 4, 5
Persons           | 2, 4, More
Luggage boot      | Small, Medium, Big
Safety            | Low, Medium, High
Result            | Unacceptable, Acceptable, Good, Very good
Mining ARs using GA

Methodology
- Selection: Tournament
- Crossover probability: Fixed (tested with 3 values)
- Mutation probability: No mutation
- Fitness function:
- Dataset: Lenses, Iris and Haberman from the UCI Irvine repository
- Population: Fixed (tested with 3 values)

Flow chart of the GA
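The loop below is an illustrative sketch of this setup: tournament
selection, fixed-probability single-point crossover and no mutation, as
listed above. The fitness function is left as a caller-supplied
callable, since the slide does not specify it; encoding each rule
candidate as a tuple of attribute values is likewise an assumption.

```python
import random

def tournament(pop, fitness, k=2):
    # Tournament selection: return the fitter of k randomly drawn individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b, pc=0.75):
    # Single-point crossover, applied with fixed probability pc.
    if random.random() < pc and len(a) > 1:
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:], b[:cut] + a[cut:]
    return a, b

def evolve(pop, fitness, generations=50, pc=0.75):
    # No mutation, per the methodology above.
    for _ in range(generations):
        nxt = []
        while len(nxt) < len(pop):
            p1 = tournament(pop, fitness)
            p2 = tournament(pop, fitness)
            nxt.extend(crossover(p1, p2, pc))
        pop = nxt[:len(pop)]
    return max(pop, key=fitness)    # best rule candidate found
```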
Results Analysis

Comparison based on variation in population size
(values are Accuracy % followed by No. of Generations in parentheses):

Dataset  | Pop = No. of Instances | Instances x 1.25 | Instances x 1.5
Lenses   | 75 (7)                 | 82 (12)          | 95 (17)
Haberman | 71 (114)               | 68 (88)          | 64 (70)
Iris     | 77 (88)                | 87 (53)          | 82 (45)

Comparison based on variation in minimum support and confidence
(values are Accuracy % followed by No. of Generations in parentheses):

Dataset  | sup=0.4, conf=0.4 | sup=0.9, conf=0.9 | sup=0.9, conf=0.2 | sup=0.2, conf=0.9
Lenses   | 22 (20)           | 49 (11)           | 70 (21)           | 95 (18)
Haberman | 45 (68)           | 58 (83)           | 71 (90)           | 62 (75)
Iris     | 40 (28)           | 59 (37)           | 78 (48)           | 87 (55)
Comparison based on variation in crossover probability
(values are Accuracy % followed by No. of Generations in parentheses):

Dataset  | Pc = 0.25 | Pc = 0.5 | Pc = 0.75
Lenses   | 95 (8)    | 95 (16)  | 95 (13)
Haberman | 69 (77)   | 71 (83)  | 70 (80)
Iris     | 84 (45)   | 86 (51)  | 87 (55)

Comparison of the optimum parameter values for the maximum accuracy achieved:

Dataset  | Instances | Attributes | Population | Min Support | Min Confidence | Crossover Rate | Accuracy %
Lenses   | 24        | 4          | 36         | 0.2         | 0.9            | 0.25           | 95
Haberman | 306       | 3          | 306        | 0.9         | 0.2            | 0.5            | 71
Iris     | 150       | 5          | 225        | 0.2         | 0.9            | 0.75           | 87
[Figures: Population size vs. accuracy; minimum support and confidence
vs. accuracy]
Inferences

- The values of minimum support, minimum confidence and mutation rate
  decide the accuracy of the system more than the other GA parameters.
- The crossover rate affects the convergence rate rather than the
  accuracy of the system.
- The optimum values of the GA parameters vary from dataset to dataset,
  and the fitness function plays a major role in optimizing the results.
Mining ARs using Self-Adaptive GA (implemented in Java)

Methodology
- Selection: Roulette wheel
- Crossover probability: Fixed (tested with 3 values)
- Mutation probability: Self-adaptive
- Fitness function:
- Dataset: Lenses, Iris and Car from the UCI Irvine repository
- Population: Fixed (tested with 3 values)
Self Adaptive GA

Procedure SAGA
Begin
    Initialize population p(k);
    Define the crossover and mutation rates;
    Do
    {
        Do
        {
            Calculate the support of all k rules;
            Calculate the confidence of all k rules;
            Obtain fitness;
            Select individuals for crossover / mutation;
            Calculate the average fitness of the nth and (n-1)th generations;
            Calculate the maximum fitness of the nth and (n-1)th generations;
            Based on the fitness of the selected item, calculate the new
            crossover and mutation rates;
            Choose the operation to be performed;
        } k times;
    } until termination;
End
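The rate update itself is not reproduced on the slide; the sketch below
uses a common Srinivas-Patnaik-style scheme, which likewise derives the
new crossover and mutation rates from the average and maximum fitness,
as an assumption of what the computation could look like:

```python
def adaptive_rates(f, f_avg, f_max, k1=1.0, k2=0.5, k3=1.0, k4=0.5):
    # Return (crossover rate, mutation rate) for an individual of fitness f.
    # Fitter-than-average individuals get lower rates, preserving good rules;
    # weaker individuals get the full rates k3, k4 to keep exploring.
    if f >= f_avg and f_max > f_avg:
        pc = k1 * (f_max - f) / (f_max - f_avg)
        pm = k2 * (f_max - f) / (f_max - f_avg)
        return pc, pm
    return k3, k4
```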
Results Analysis

Accuracy comparison between GA and SAGA when parameters are ideal for
the traditional GA (Accuracy %, No. of Generations in parentheses):

Dataset        | Traditional GA | Self-Adaptive GA
Lenses         | 75 (38)        | 87.5 (35)
Haberman       | 52 (36)        | 68 (28)
Car Evaluation | 85 (29)        | 96 (21)

Accuracy comparison between GA and SAGA when parameters are set to the
termination of SAGA (Accuracy %, No. of Generations in parentheses):

Dataset        | Traditional GA | Self-Adaptive GA
Lenses         | 50 (35)        | 87.5 (35)
Haberman       | 36 (38)        | 68 (28)
Car Evaluation | 74 (36)        | 96 (21)
[Figure: Predictive accuracy (%) of traditional GA vs. self-adaptive GA
on Lenses, Haberman and Car Evaluation, when parameters are ideal for
the traditional GA]
[Figure: Predictive accuracy (%) of traditional GA vs. self-adaptive GA
on Lenses, Haberman and Car Evaluation, when parameters are set
according to the termination of SAGA]
Inferences

- The self-adaptive GA gives better accuracy than the traditional GA.
GA with Elitism for Mining ARs

Methodology
- Selection: Elitism with roulette wheel
- Crossover probability: Fixed to Pc
- Mutation probability: Self-adaptive
- Fitness function: Fitness(x) = conf(x) * log(sup(x) * length(x) + 1)
- Dataset: Lenses, Iris and Car from the UCI Irvine repository
- Population: Fixed
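A sketch of this fitness function together with the elitism step is
shown below. `breed` is a hypothetical stand-in for the roulette-wheel
selection, crossover and self-adaptive mutation pipeline, which the
slides do not spell out; the logarithm base is also not stated, so the
natural log is assumed.

```python
import heapq
import math

def fitness(conf, sup, length):
    # Fitness(x) = conf(x) * log(sup(x) * length(x) + 1), from the slide;
    # natural log assumed, since the base is not stated.
    return conf * math.log(sup * length + 1)

def next_generation(pop, fit, breed, elite=2):
    # Elitism: carry the `elite` fittest chromosomes over unchanged, then
    # fill the remainder via `breed` (selection + crossover + mutation),
    # a hypothetical helper not defined on the slides.
    elites = heapq.nlargest(elite, pop, key=fit)
    return elites + breed(pop, len(pop) - elite)
```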
Results Analysis

Predictive accuracy (%) for mining ARs based on GA with elitism:

No. of Iterations | Lenses | Car Evaluation | Haberman
4                 | 90     | 94.4           | 70
6                 | 87.5   | 91.6           | 75
8                 | 91.6   | 92.8           | 91.6
10                | 90     | 87.5           | 75
15                | 87.5   | 90             | 83.3
20                | 91.6   | 87.5           | 91.6
25                | 87.5   | 87.5           | 92.5
30                | 83.3   | 93.75          | 83.3
50                | 90     | 75             | 75
[Figure: Predictive accuracy (%) for mining ARs based on GA with
elitism, plotted against the number of iterations for Lenses, Car
Evaluation and Haberman]
[Figure: No. of matches vs. no. of iterations]
Execution time (ms) for mining ARs based on GA with elitism:

No. of Iterations | Lenses | Car Evaluation | Haberman
4                 | 15     | 547            | 125
6                 | 16     | 721            | 156
8                 | 31     | 927            | 187
10                | 31     | 1104           | 203
15                | 32     | 1525           | 281
20                | 47     | 1967           | 359
25                | 63     | 2504           | 421
30                | 78     | 2935           | 530
50                | 94     | 4753           | 998
[Figure: Execution time (ms) vs. number of iterations for Haberman, Car
Evaluation and Lenses]
Inferences

- Marginally better accuracy is achieved.
- Computational efficiency is found to be optimum.
- Elitism helps retain chromosomes with good fitness values for the
  next generation.
Mining ARs using PSO

Methodology
- Each data itemset is represented as a particle.
- Each particle moves according to its velocity:
      v_i(t+1) = v_i(t) + c1*r1*(pbest_i - x_i(t)) + c2*r2*(gbest - x_i(t))   (eq. 1)
- Each particle's position is updated by:
      x_i(t+1) = x_i(t) + v_i(t+1)                                            (eq. 3)

Particle Swarm Optimization (PSO)
Flow chart depicting the general PSO algorithm:

1. Start: initialize particles with random position and velocity vectors.
2. For each particle's position p, evaluate its fitness.
3. If fitness(p) is better than fitness(pbest), set pbest = p.
   (Loop until all particles are exhausted.)
4. Set the best of the pbests as gbest.
5. Update each particle's velocity (eq. 1) and position (eq. 3).
   (Loop until the maximum number of iterations is reached.)
6. Stop: gbest is the optimal solution.
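A compact sketch of this loop, under the assumption of maximization and
the common defaults c1 = c2 = 2.0 (the slides do not state them):

```python
import random

def pso(fitness, dim, n_particles=30, iters=50, c1=2.0, c2=2.0):
    # Maximizing PSO following eq. 1 (velocity) and eq. 3 (position).
    xs = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    gbest = max(pbest, key=fitness)
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] += c1 * r1 * (pbest[i][d] - x[d]) \
                          + c2 * r2 * (gbest[d] - x[d])   # eq. 1
                x[d] += vs[i][d]                          # eq. 3
            if fitness(x) > fitness(pbest[i]):            # update personal best
                pbest[i] = x[:]
        gbest = max(pbest, key=fitness)                   # update global best
    return gbest
```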
Results Analysis

Predictive accuracy (%):

Dataset        | Traditional GA | Self-Adaptive GA | PSO
Lenses         | 87.5           | 91.6             | 92.8
Haberman       | 75.5           | 92.5             | 91.6
Car Evaluation | 85             | 94.4             | 95
[Figures: Execution time (ms) vs. number of iterations for PSO and SAGA
on Haberman, Lenses and Car Evaluation]
Inferences

- PSO produces results as effective as the self-adaptive GA.
- PSO is computationally marginally faster than SAGA.
- In PSO only the best particle passes information to the others, hence
  the computational capability of PSO is marginally better than SAGA's.
Mining ARs using Chaotic PSO

Methodology
- A new chaotic map model is formulated, and the velocity of each
  particle is updated through it.
- The initial points u_0 and v_0 are set to 0.1.

Flow of the algorithm:
1. Start; set k = 1; initialize x_i(k) and v_i(k).
2. Compute f(x_i(k)) and generate the neighborhoods.
3. For i = 1 to N: determine the best particles in the neighborhood of
   i and update the previous best if necessary.
4. Compute x_i(k+1) and f(x_i(k+1)).
5. Reorder the particles.
6. Set k = k + 1; repeat from step 2 while k <= K, then stop.
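The chaotic map equation did not survive extraction; the logistic map,
z_{k+1} = 4 z_k (1 - z_k), is the map most commonly used in chaotic PSO
variants and is assumed in the sketch below. Its values replace the
uniform random numbers r1, r2 in the velocity update; the starting
point 0.1 matches the slide's initial point.

```python
def chaotic_sequence(z0=0.1, n=100):
    # Logistic map: z_{k+1} = 4 * z_k * (1 - z_k). Starting point 0.1
    # matches the slide's initial value; the values stay in (0, 1) and
    # stand in for the uniform random numbers r1, r2 of eq. 1.
    z = z0
    for _ in range(n):
        z = 4.0 * z * (1.0 - z)
        yield z
```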
[Figure: Predictive accuracy (%) comparison of SAGA, PSO and CPSO on
Haberman, Lenses and Car Evaluation]
[Figures: Convergence rate comparison of SAGA, PSO and CPSO over 4-50
iterations for Lenses, Car Evaluation and Haberman's Survival]

Inferences

- Better accuracy than PSO.
- The chaotic operators can be changed by altering the initial values
  of the chaotic operator function.
- The balance between exploration and exploitation is maintained.
Mining ARs using Dynamic Neighborhood Selection in PSO

Methodology
The concept of a local best particle (lbest) replacing the particle
best (pbest) is introduced. The neighborhood best (lbest) is selected
as follows (see the sketch below):
1. Calculate the distance of the current particle from the other
   particles.
2. Based on the calculated distances, find the nearest m particles as
   the neighbors of the current particle.
3. Choose the local optimum, lbest, among the neighborhood in terms of
   fitness values.
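A sketch of this selection, assuming Euclidean distance between
particle positions (the slides do not name the metric):

```python
import math

def lbest_index(i, xs, fits, m=3):
    # Find the m particles nearest to particle i (Euclidean distance),
    # then return the index of the fittest particle in that neighborhood
    # as i's lbest.
    others = [j for j in range(len(xs)) if j != i]
    neighbors = sorted(others, key=lambda j: math.dist(xs[i], xs[j]))[:m]
    return max(neighbors, key=lambda j: fits[j])
```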
Interestingness Measure

The interestingness measure for a rule is taken from relative
confidence, where k is the rule, x the antecedent part of the rule and
y the consequent part of the rule k.
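The formula itself was lost in extraction; one standard form of
relative (centered) confidence, conf(x => y) - sup(y), is assumed in
the sketch below.

```python
def interestingness(conf_xy, sup_y):
    # Relative (centered) confidence: how much the antecedent x raises
    # the probability of the consequent y over y's base rate.
    return conf_xy - sup_y
```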
[Figure: Predictive accuracy (%) comparison of SAGA, PSO and NPSO for
dynamic neighborhood selection in PSO on Haberman, Lenses and Car
Evaluation]
Measure of Interestingness

Dataset             | Interestingness Value
Lens                | 0.82
Car Evaluation      | 0.73
Haberman's Survival | 0.8
[Figure: Execution time comparison for dynamic neighborhood selection
in PSO: PSO vs. NPSO on Lenses, Haberman's Survival and Car Evaluation]
[Figures: Predictive accuracy (%) over generations, PSO vs. NPSO, for
(a) Car Evaluation, (b) Lenses, (c) Haberman's Survival]
Inferences

- Avoiding premature convergence at locally optimal points tends to
  enhance the results.
- Selecting the local best (lbest) based on neighbors rather than the
  particle's own best (pbest) enhances the accuracy of the rules mined.
Mining ARs using Self-Adaptive Chaotic PSO

A slight variant of PSO, called inertia-weight PSO, adds a weight
parameter w to the velocity equation:

    v_i(t+1) = w*v_i(t) + c1*r1*(pbest_i - x_i(t)) + c2*r2*(gbest - x_i(t))

where w is the inertia weight, which balances the global search against
the local search. An adaptive scheme is used for w:

    w = w_max - (w_max - w_min) * g / G

where g is the generation index representing the current number of
evolutionary generations, and G is a predefined maximum number of
generations. The maximal and minimal weights w_max and w_min are
usually set to 0.9 and 0.4, respectively.
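A one-line transcription of this schedule, with the surrounding
velocity update shown as a comment:

```python
def inertia(g, G, w_max=0.9, w_min=0.4):
    # w = w_max - (w_max - w_min) * g / G: decays linearly from w_max
    # to w_min as the generation g approaches the maximum G.
    return w_max - (w_max - w_min) * g / G

# Velocity update with inertia weight:
#   v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
```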
Effect of changing w

Highest predictive accuracy (%) achieved within 50 iterations:

Dataset  | No weight (normal PSO) | w = 0.5 | w = 0.7
Lenses   | 87.5                   | 88.09   | 84.75
Haberman | 87.5                   | 96.07   | 99.80
Car      | 96.4                   | 99.88   | 99.84
POP Care | 91.6                   | 98.64   | 97.91
Zoo      | 83.3                   | 96.88   | 98.97
[Figures: Predictive accuracy over generations for CPSO, weighted CPSO
and self-adaptive CPSO on Lenses, Haberman's Survival, Post-operative
Patient Care, Zoo and Car Evaluation]
Inferences

- In terms of computational efficiency, SACPSO is faster than GA.
- Setting appropriate values for the control parameters involved in
  these heuristic methods is the key to their success.
Mining AR using Hybrid GA/PSO

For 1 to Elite:
    x <- copy(x_best)
For 1 to (pop_size - Elite) * breed_ratio:
    x <- Select an Individual
    x <- Update Velocity
    x <- Update Position
For 1 to (pop_size - Elite) * (1 - breed_ratio):
    x1 <- Select an Individual
    x2 <- Select an Individual
    Crossover(x1, x2)
    Mutate(x1, x2)
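A runnable sketch of one generation of this hybrid follows; the PSO and
GA stage functions are passed in as parameters because the slides do
not define them.

```python
import random

def gpso_generation(pop, fit, pso_step, ga_step, elite=2, breed_ratio=0.5):
    # Keep the elite unchanged, breed a `breed_ratio` share of the rest
    # with the PSO update, and the remainder with GA crossover + mutation.
    # `pso_step` and `ga_step` are caller-supplied stage functions.
    nxt = sorted(pop, key=fit, reverse=True)[:elite]
    n_pso = int((len(pop) - elite) * breed_ratio)
    nxt += [pso_step(random.choice(pop)) for _ in range(n_pso)]
    while len(nxt) < len(pop):
        nxt += ga_step(pop)
    return nxt[:len(pop)]
```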
Conclusion

- When the genetic algorithm is used for mining association rules, an
  improvement in predictive accuracy is achieved.
- Particle swarm optimization, when adopted for mining association
  rules, produces results close to GA but with minimum execution time.
- Premature convergence, the major drawback of PSO, was handled by
  introducing inertia weights, chaotic maps, neighborhood selection and
  an adaptive inertia weight.
Publications

Conferences
- K. Indira, S. Kanmani, "Framework for Comparison of Association Rule
  Mining Using Genetic Algorithm", International Conference on
  Computers, Communication & Intelligence, 2010.
- K. Indira, S. Kanmani, "Mining Association Rules Using Genetic
  Algorithm: The Role of Estimation Parameters", International
  Conference on Advances in Computing and Communications,
  Communications in Computer and Information Science, Springer, Vol.
  190, Part 8, pp. 639-648, 2011.
- K. Indira, S. Kanmani, Gaurav Sethia D., Kumaran S., Prabhakar J.,
  "Rule Acquisition in Data Mining Using a Self Adaptive Genetic
  Algorithm", First International Conference on Computer Science and
  Information Technology, Communications in Computer and Information
  Science, Springer, Vol. 204, Part 1, pp. 171-178, 2011.
- K. Indira, S. Kanmani, Prasanth, Harish, Jeeva, "Population Based
  Search Methods in Mining Association Rules", Third International
  Conference on Advances in Communication, Network, and Computing (CNC
  2012), LNICST, pp. 255-261, 2012.

Journals
- K. Indira, S. Kanmani, "Performance Analysis of Genetic Algorithm for
  Mining Association Rules", IJCSI International Journal of Computer
  Science Issues, Vol. 9, Issue 2, No. 1, pp. 368-376, March 2012.
- K. Indira, S. Kanmani, "Rule Acquisition using Genetic Algorithm",
  accepted for publication in Journal of Computing.
- K. Indira, S. Kanmani, "Enhancing Particle Swarm Optimization using
  Chaotic Operators for Association Rule Mining", communicated to
  International Journal of Computer Science and Techniques.
- K. Indira, S. Kanmani, "Association Rule Mining by Dynamic
  Neighborhood Selection in Particle Swarm Optimization", communicated
  to World Science Publications.

References

- Jing Li, Han Rui-feng, "A Self-Adaptive Genetic Algorithm Based on
  Real-Coded", International Conference on Biomedical Engineering and
  Computer Science, pp. 1-4, 2010.
- Chuan-Kang Ting, Wei-Ming Zeng, Tzu-Chieh Lin, "Linkage Discovery
  through Data Mining", IEEE Computational Intelligence Magazine, Vol.
  5, February 2010.
- Caises, Y., Leyva, E., Gonzalez, A., Perez, R., "An Extension of the
  Genetic Iterative Approach for Learning Rule Subsets", 4th
  International Workshop on Genetic and Evolutionary Fuzzy Systems,
  pp. 63-67, 2010.
- Shangping Dai, Li Gao, Qiang Zhu, Changwu Zhu, "A Novel Genetic
  Algorithm Based on Image Databases for Mining Association Rules", 6th
  IEEE/ACIS International Conference on Computer and Information
  Science, pp. 977-980, 2007.
- Peregrin, A., Rodriguez, M.A., "Efficient Distributed Genetic
  Algorithm for Rule Extraction", Eighth International Conference on
  Hybrid Intelligent Systems (HIS '08), pp. 531-536, 2008.
- Mansoori, E.G., Zolghadri, M.J., Katebi, S.D., "SGERD: A Steady-State
  Genetic Algorithm for Extracting Fuzzy Classification Rules From
  Data", IEEE Transactions on Fuzzy Systems, Vol. 16, Issue 4, pp.
  1061-1071, 2008.
- Xiaoyuan Zhu, Yongquan Yu, Xueyan Guo, "Genetic Algorithm Based on
  Evolution Strategy and the Application in Data Mining", First
  International Workshop on Education Technology and Computer Science
  (ETCS '09), Vol. 1, pp. 848-852, 2009.
- Hong Guo, Ya Zhou, "An Algorithm for Mining Association Rules Based
  on Improved Genetic Algorithm and its Application", 3rd International
  Conference on Genetic and Evolutionary Computing (WGEC '09), pp.
  117-120, 2009.
- Genxiang Zhang, Haishan Chen, "Immune Optimization Based Genetic
  Algorithm for Incremental Association Rules Mining", International
  Conference on Artificial Intelligence and Computational Intelligence
  (AICI '09), Vol. 4, pp. 341-345, 2009.
- Maria J. Del Jesus, Jose A. Gamez, Pedro Gonzalez, Jose M. Puerta,
  "On the Discovery of Association Rules by means of Evolutionary
  Algorithms", Advanced Review, John Wiley & Sons, Inc., 2011.
- Junli Lu, Fan Yang, Momo Li, Lizhen Wang, "Multi-objective Rule
  Discovery Using the Improved Niched Pareto Genetic Algorithm", Third
  International Conference on Measuring Technology and Mechatronics
  Automation, 2011.
- Hamid Reza Qodmanan, Mahdi Nasiri, Behrouz Minaei-Bidgoli, "Multi
  Objective Association Rule Mining with Genetic Algorithm without
  Specifying Minimum Support and Minimum Confidence", Expert Systems
  with Applications, 38 (2011), pp. 288-298.
- Miguel Rodriguez, Diego M. Escalante, Antonio Peregrin, "Efficient
  Distributed Genetic Algorithm for Rule Extraction", Applied Soft
  Computing, 11 (2011), pp. 733-743.
- J.H. Ang, K.C. Tan, A.A. Mamun, "An Evolutionary Memetic Algorithm
  for Rule Extraction", Expert Systems with Applications, 37 (2010),
  pp. 1302-1315.
- R.J. Kuo, C.M. Chao, Y.T. Chiu, "Application of Particle Swarm
  Optimization to Association Rule Mining", Applied Soft Computing, 11
  (2011), pp. 326-336.
- Bilal Alatas, Erhan Akin, "Multi-objective Rule Mining Using a
  Chaotic Particle Swarm Optimization Algorithm", Knowledge-Based
  Systems, 22 (2009), pp. 455-460.
- Mourad Ykhlef, "A Quantum Swarm Evolutionary Algorithm for Mining
  Association Rules in Large Databases", Journal of King Saud
  University - Computer and Information Sciences, 23 (2011), pp. 1-6.
- Haijun Su, Yupu Yang, Liang Zhao, "Classification Rule Discovery with
  DE/QDE Algorithm", Expert Systems with Applications, 37 (2010), pp.
  1216-1222.
- Yamina Mohamed Ben Ali, "Soft Adaptive Particle Swarm Algorithm for
  Large Scale Optimization", IEEE, 2010.
- Yan Chen, Shingo Mabu, Kotaro Hirasawa, "Genetic Relation Algorithm
  with Guided Mutation for the Large-Scale Portfolio Optimization",
  Expert Systems with Applications, 38 (2011), pp. 3353-3363.
- Feng Lu, Yanfeng Ge, LiQun Gao, "Self-adaptive Particle Swarm
  Optimization Algorithm for Global Optimization", Sixth International
  Conference on Natural Computation (ICNC 2010), 2010.
- Fevrier Valdez, Patricia Melin, Oscar Castillo, "An Improved
  Evolutionary Method with Fuzzy Logic for Combining Particle Swarm
  Optimization and Genetic Algorithms", Applied Soft Computing, 11
  (2011), pp. 2625-2632.