Regular Paper
a Department of ECE, National Institute of Science and Technology, Berhampur 761008, India
b School of Electrical Sciences, Indian Institute of Technology Bhubaneswar, India
c IDCOM, The University of Edinburgh, Edinburgh, UK
d Department of Computer Science and Information Technology, Guru Ghasidas Vishwavidyalaya (Central University), Bilaspur 495009, India
Article info

Article history:
Received 3 March 2012
Received in revised form 30 October 2012
Accepted 1 November 2012
Available online 24 November 2012

Keywords:
Array signal processing
DOA estimation
ML estimation
Particle swarm optimization
Diffusion algorithm

Abstract

This paper proposes a distributed DOA estimation technique using clustering of sensor nodes and a distributed PSO algorithm. The sensor nodes are clustered to act as random arrays. Each cluster estimates the source bearing by optimizing the Maximum Likelihood (ML) function locally in cooperation with the other clusters. During the estimation process each cluster shares its best information, obtained by Diffusion Particle Swarm Optimization (DPSO), with the other clusters so that a global estimate is achieved. The performance of the proposed technique has been evaluated through a simulation study and compared with that obtained by the centralized and decentralized MUltiple SIgnal Classification (MUSIC) algorithms and the distributed in-network algorithm. The results demonstrate improved performance of the proposed method compared to the others, although the new method is slightly inferior to the centralized Particle Swarm Optimization-Maximum Likelihood (PSO-ML) algorithm. Further, the proposed method offers lower communication overhead than the other methods.

© 2012 Elsevier B.V. All rights reserved.
1. Introduction
Accurate Direction of Arrival (DOA) estimation is an important problem in tracking and localizing sources. In DOA estimation, the outputs from a set of sensors are analyzed to determine the bearings of signals arriving at the sensor array. The maximum likelihood (ML) method is one of the best techniques used in source localization problems [1]. The ML function can be formulated from the signal-noise model equation and attains its maximum when the trial angle of arrival is exactly equal to the actual incident angle. Standard iterative approaches have been proposed to solve the resulting non-linear optimization problem. Due to the high computational load of the multivariate non-linear optimization required in ML estimation, several high-resolution suboptimal techniques have been suggested in the literature, including the minimum variance method of Capon [2], the MUltiple SIgnal Classification (MUSIC) method of Schmidt [3], Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT) [4], etc. In general, the performance of suboptimal techniques is inferior to the ML technique at low signal-to-noise ratio (SNR) or for a small number of data samples.
http://dx.doi.org/10.1016/j.swevo.2012.11.001
2. Problem formulation
Consider a sensor network (SN) with N sensor nodes distributed over some geographical region to collect data for DOA estimation. These nodes receive the signal from M < N narrowband far-field sources from direction angles \theta = [\theta_1, \theta_2, \ldots, \theta_M]^T. Let all the sensor nodes in the network form an arbitrary array with a_g(\theta) as the complex N x 1 response vector of the global sensor array in the direction \theta. The global N x M steering matrix for the M sources is A_g(\theta) = [a_g(\theta_1), \ldots, a_g(\theta_M)], which depends on the positions of the nodes in the network. The (n,m)th element of A_g(\theta), the complex gain with respect to a reference point of the mth signal at the nth sensor located at (x_n, y_n), is modeled as

A_g(n,m) = \exp\left( j \frac{2\pi}{\lambda} (x_n \sin\theta_m + y_n \cos\theta_m) \right),  n = 1,2,\ldots,N,  m = 1,2,\ldots,M    (1)

where \lambda is the signal wavelength. The array output at the ith snapshot is y_g(i) = A_g(\theta) s(i) + n_g(i), i = 1,2,\ldots,L, where s(i) is the source signal vector and n_g(i) is additive noise. Modeling each snapshot as a zero-mean circular complex Gaussian vector with covariance matrix R_g(\theta), the likelihood of a single snapshot is

p_g(y_g(i) \mid \theta) = \frac{1}{\det(\pi R_g(\theta))} \exp\{ -y_g^H(i) R_g^{-1}(\theta) y_g(i) \}    (3)

Assuming the L snapshots are independent, the joint likelihood is the product

\prod_{i=1}^{L} \frac{1}{\det(\pi R_g(\theta))} \exp\{ -y_g^H(i) R_g^{-1}(\theta) y_g(i) \} = \prod_{i=1}^{L} p_g(y_g(i) \mid \theta)    (4)

and the corresponding log-likelihood is

f_g(\theta) = \ln \prod_{i=1}^{L} p_g(y_g(i) \mid \theta) = \sum_{i=1}^{L} \ln p_g(y_g(i) \mid \theta)    (5)
For convenience, we suppress the dependency of f on the observations in the notation and write f(y(i) \mid \theta) as f(\theta). The use of a simple product of probabilities assumes that the snapshots are independent random events. The log-likelihood expression is convenient because it transforms the product into a summation and, in addition, for Gaussian densities the log function inverts the exponential function.
Further, following [32], the ML estimate of the DOAs is obtained by minimizing

f_g(\theta) = \log \det\left( P_{A_g} \hat{R}_g P_{A_g} + \frac{\mathrm{tr}(P^{\perp}_{A_g} \hat{R}_g)}{N - M} P^{\perp}_{A_g} \right)    (7)

where tr(.) denotes the trace of the bracketed matrix, P_{A_g} = A_g (A_g^H A_g)^{-1} A_g^H is the projection onto the column space of A_g, and P^{\perp}_{A_g} = I - P_{A_g} is its orthogonal complement. The sample covariance matrix \hat{R}_g is defined as

\hat{R}_g = \frac{1}{L} \sum_{i=1}^{L} y_g(i) y_g^H(i)    (8)

and the DOA estimate is

\hat{\theta} = \arg\min_{\theta} f_g(\theta)    (9)
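The formulation above can be illustrated numerically. The following sketch (an illustration, not the authors' code) builds the steering vector of (1) for an arbitrary planar array, simulates snapshots, forms the sample covariance of (8), and locates the minimum of the concentrated ML cost (7) by grid search. The node positions, SNR, snapshot count, and the true DOA of 40 degrees are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed geometry: N sensors at random planar positions, measured in wavelengths
N, M, L = 12, 1, 100                      # sensors, sources, snapshots
pos = rng.uniform(-2.0, 2.0, (N, 2))      # node positions (x_n, y_n)

def steering(theta):
    """Global steering vector of eq. (1); positions in wavelengths, so 2*pi/lambda -> 2*pi."""
    return np.exp(1j * 2 * np.pi * (pos[:, 0] * np.sin(theta) + pos[:, 1] * np.cos(theta)))

# Simulate L snapshots y_g(i) = A_g(theta) s(i) + n_g(i) at an assumed true DOA of 40 degrees
theta_true = np.deg2rad(40.0)
A = steering(theta_true)[:, None]
S = (rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L))) / np.sqrt(2)
noise = (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))) * 0.1 / np.sqrt(2)
Y = A @ S + noise

R_hat = (Y @ Y.conj().T) / L              # sample covariance, eq. (8)

def ml_cost(theta):
    """Concentrated ML cost of eq. (7)."""
    Ag = steering(theta)[:, None]
    P = Ag @ np.linalg.pinv(Ag)           # projection P_A onto span(A_g)
    Pp = np.eye(N) - P                    # orthogonal complement
    Sigma = P @ R_hat @ P + (np.trace(Pp @ R_hat).real / (N - M)) * Pp
    return np.linalg.slogdet(Sigma)[1]    # log det of the bracketed matrix

grid = np.deg2rad(np.arange(0.0, 180.0, 0.5))
theta_hat = grid[np.argmin([ml_cost(t) for t in grid])]   # eq. (9)
print(round(np.rad2deg(theta_hat), 1))
```

In practice the grid search is only a coarse stand-in for the PSO search used later in the paper; the concentrated cost has the same minimizer either way.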
For distributed processing, let y_k(i) denote the observation of the kth sensor node, modeled as a zero-mean circular complex Gaussian with covariance R_k(\theta):

p_k(y_k(i) \mid \theta) = \frac{1}{\det(\pi R_k(\theta))} \exp\{ -y_k^H(i) R_k^{-1}(\theta) y_k(i) \},  i = 1,2,\ldots,L    (11)

Assuming the node observations are independent, the global density factorizes over the nodes as

p_g(y_g(i) \mid \theta) = \prod_{k=1}^{N} p_k(y_k(i) \mid \theta)    (13)

Using (12), the global cost function (14) can be written in additive form as the sum of local cost functions f_k(\theta):

f_g(\theta) = \sum_{k=1}^{N} f_k(\theta)    (15)

where

f_k(\theta) = \ln \prod_{i=1}^{L} p_k(y_k(i) \mid \theta) = \sum_{i=1}^{L} \ln p_k(y_k(i) \mid \theta)    (16)

Similarly, at the cluster level, let y_{c_j}(i) denote the observations collected by the jth cluster, with covariance R_{c_j}(\theta). Then

p_{c_j}(y_{c_j}(i) \mid \theta) = \frac{1}{\det(\pi R_{c_j}(\theta))} \exp\{ -y_{c_j}^H(i) R_{c_j}^{-1}(\theta) y_{c_j}(i) \},  i = 1,2,\ldots,L    (17)

f_j(\theta) = \ln \prod_{i=1}^{L} p_{c_j}(y_{c_j}(i) \mid \theta) = \sum_{i=1}^{L} \ln p_{c_j}(y_{c_j}(i) \mid \theta)    (18)

and the global density factorizes over the N_c clusters as

p_g(y_g(i) \mid \theta) = \prod_{j=1}^{N_c} p_{c_j}(y_{c_j}(i) \mid \theta)    (19)

Using (18), the global cost function can be rewritten as the sum of the local cost functions of the corresponding clusters:

f_g(\theta) = \sum_{j=1}^{N_c} f_j(\theta)    (21)

The clusters themselves can be formed by minimizing the k-means objective

J = \sum_{j=1}^{N_c} \sum_{i \in N_{c_j}} \| x_i - c_j \|^2    (23)

where x_i is the position of node i, N_{c_j} is the set of nodes assigned to the jth cluster, and c_j is its centroid.
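The additive decomposition in (21) rests on the independence assumption behind (19). A small numerical check (illustrative; the cluster sizes and block covariances are assumed): when the global covariance R_g is block-diagonal over the clusters, the global Gaussian log-likelihood equals the sum of the per-cluster log-likelihoods exactly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed partition: N = 9 nodes in N_c = 3 clusters of 3 nodes each
clusters = [np.arange(0, 3), np.arange(3, 6), np.arange(6, 9)]
N, L = 9, 50

# Independent clusters => block-diagonal global covariance R_g = diag(R_c1, R_c2, R_c3)
blocks = []
for c in clusters:
    G = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    blocks.append(G @ G.conj().T + 3 * np.eye(3))    # Hermitian positive definite block
R_g = np.zeros((N, N), dtype=complex)
for c, R in zip(clusters, blocks):
    R_g[np.ix_(c, c)] = R

def log_density(y, R):
    """ln p(y) for a zero-mean circular complex Gaussian, cf. eqs. (3) and (17)."""
    _, logdetR = np.linalg.slogdet(R)
    return -(y.conj() @ np.linalg.solve(R, y)).real - len(y) * np.log(np.pi) - logdetR

# Draw L snapshots from CN(0, R_g)
C = np.linalg.cholesky(R_g)
Y = C @ ((rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))) / np.sqrt(2))

f_global = sum(log_density(Y[:, i], R_g) for i in range(L))       # global log-likelihood
f_clusters = sum(log_density(Y[c, :][:, i], R)                    # sum of eq. (18) terms
                 for c, R in zip(clusters, blocks) for i in range(L))
print(np.isclose(f_global, f_clusters))   # True: the decomposition of eq. (21)
```

The equality is exact (up to floating point) because both the quadratic form and the log-determinant of a block-diagonal matrix split over the blocks.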
In distributed optimization methods using the consensus algorithm [35], the goal of each cluster is to solve the optimization problem in a cooperative manner. Let f_j(\theta) represent the local objective function of the jth cluster, which is known to that particular cluster only. Then the optimization problem is to minimize

f_g(\theta) = \sum_{j=1}^{N_c} f_j(\theta)    (24)

subject to \theta \in R^M.
Since the ML function is multimodal, the cost functions are not convex, but they share a common global solution (all provide unbiased bearing estimates). In the vicinity of the solution, i.e., in a small region around it, all the functions are convex. Therefore it is possible to initialize an optimization procedure within this region.
The effectiveness of the PSO (or CLPSO) algorithm is that, in a multimodal problem, it can search the region where the global solution lies. Further, it enables the individual clusters to interact with the other clusters to make the search process faster and more accurate. Hence an algorithm is needed that serves as a basic mechanism for distributing the optimization among the clusters.
In this formulation each cluster runs the PSO (or CLPSO) algorithm to optimize its own cost function and simultaneously shares its best solution with the other clusters. In particular, each cluster starts with an initial estimate \theta_j(0) and updates its estimate at discrete times i = 1,2,\ldots. Let \theta_j(i) be the bearing vector estimated by the jth cluster at time i. While updating, a cluster combines its current estimate \theta_{c_j}(i) with the estimates \theta_{c_l}(i) received from the other clusters l. Hence the update equation is

\theta_{c_j}(i+1) = \sum_{l=1}^{N_c} a_{jl}(i) \theta_{c_l}(i)    (26)

where a_{jl}(i) are the combination weights. Similarly, the global best (gbest) particle of each cluster is diffused among the clusters as

p_{g_j}^{i+1} = \sum_{n=1}^{N_c} a_{jn} p_{g_n}^{i+1}    (29)

where p_{g_j} denotes the gbest of the jth cluster.
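The combine steps (26) and (29) are weighted averages under the weights a_{jl}. A minimal sketch (the four-cluster ring topology, Metropolis weights and initial estimates are assumptions for illustration): repeated combining drives the per-cluster estimates to a common value.

```python
import numpy as np

# Assumed cluster graph: 4 clusters connected in a ring
Nc = 4
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}

# Metropolis weights: a common doubly stochastic choice for the a_jl in eq. (26)
A = np.zeros((Nc, Nc))
for j in range(Nc):
    for l in neighbors[j]:
        A[j, l] = 1.0 / (1 + max(len(neighbors[j]), len(neighbors[l])))
    A[j, j] = 1.0 - A[j].sum()

# Initial per-cluster bearing estimates theta_cj(0), in degrees (assumed values)
theta = np.array([78.0, 81.0, 83.5, 80.5])
for _ in range(50):
    theta = A @ theta                 # combine step, eq. (26)/(29)
print(theta.round(4))                 # every entry converges to the average, 80.75
```

Because A is doubly stochastic, the fixed point is the network average of the initial values; in the proposed algorithm the quantity being combined is each cluster's gbest rather than a raw estimate.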
Algorithm 2. Main steps of block distributed in-cluster algorithm.

Setup Problem:
  As in Algorithm 1
Swarm Initialization at each cluster:
  As in Algorithm 1
for each iteration do
  for each cluster do
    for each particle do
      | These steps are the same as in Algorithm 1
    end
    if iteration > K and mod(iteration, B) = 0 then
      | Share its gbest with all neighbor nodes;
    end
  end
  Carry out the local diffusion operation according to (29);
  Check termination criterion;
end
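The structure of Algorithm 2 can be sketched with toy stand-ins (not the paper's implementation): quadratic costs replace the per-cluster ML costs, with their minimizers placed close together to mimic the shared global solution, and each cluster runs its own PSO swarm, diffusing its gbest every B iterations after a warm-up of K iterations as in (29). All parameter values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed quadratic stand-ins for the per-cluster costs f_j, minimisers near 80 degrees
mus = np.array([79.0, 80.5, 81.0, 79.5])
costs = [lambda th, m=m: (th - m) ** 2 for m in mus]

Nc, P, iters, K, B = 4, 10, 60, 5, 5            # clusters, particles, iterations, warm-up, block
W = np.full((Nc, Nc), 1.0 / Nc)                 # fully connected uniform diffusion weights

pos = rng.uniform(60.0, 100.0, (Nc, P))         # particle positions, one swarm per cluster
vel = np.zeros((Nc, P))
pbest = pos.copy()
gbest = np.array([p[np.argmin(f(p))] for f, p in zip(costs, pbest)])

for it in range(iters):
    for j in range(Nc):                          # each cluster updates its own swarm
        f = costs[j]
        r1, r2 = rng.random((2, P))
        vel[j] = 0.72 * vel[j] + 1.49 * r1 * (pbest[j] - pos[j]) + 1.49 * r2 * (gbest[j] - pos[j])
        pos[j] = pos[j] + vel[j]
        improved = f(pos[j]) < f(pbest[j])       # update personal bests
        pbest[j][improved] = pos[j][improved]
        gbest[j] = pbest[j][np.argmin(f(pbest[j]))]
    if it > K and it % B == 0:
        gbest = W @ gbest                        # block-wise local diffusion of gbests, eq. (29)

print(gbest.round(2))                            # each cluster's gbest settles near ~80
```

The block structure (sharing only every B iterations) is what reduces the communication overhead relative to Algorithm 1, at a small cost in accuracy, as the paper notes.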
The communication overhead of the distributed in-network algorithm is

M_alg3 = O(NMId)    (35)

whereas the overheads of the centralized PSO-ML and the centralized MUSIC algorithms are

M_cent = O(NLdh_N)    (37)

and O(NL_0 d h_N^3) (38), respectively. Thus the centralized MUSIC algorithm needs more communication than the centralized PSO-ML algorithm. The overall comparison between all the algorithms is given in Table 1. The value of h_c is small because it is approximately equal to \lceil h_N / N_c \rceil. Therefore the saving in overall communication overhead over the distributed in-network algorithm is O(NMI/N_c Lh_c^3). Especially for a large network the ratio N/N_c is large; hence the saving in communication overhead is large. The distributed in-network algorithm needs less communication overhead than the centralized algorithms. Therefore it may be concluded that the proposed distributed in-cluster algorithm uses less communication overhead than the others.

6. Simulation results

6.1. Example 1
Table 1
Comparison of communication overheads for different algorithms.

Algorithms                        Overall communication overhead
Proposed distributed in-cluster   O(N_c L d h_c)
Centralized PSO-ML                O(N L d h_N)
Centralized MUSIC                 O(N L_0 d h_N^3)
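To give the Table 1 expressions a concrete feel, the snippet below plugs in illustrative values; every number is an assumption for the sake of the comparison, not a value taken from the paper.

```python
import math

# All parameter values below are assumed for illustration
N, Nc, L, L0, d, I, M = 24, 3, 20, 100, 4, 100, 2
h_N = 180                     # assumed search resolution over the full network
h_c = math.ceil(h_N / Nc)     # per-cluster value, h_c ~ ceil(h_N / Nc) as noted above

proposed = Nc * L * d * h_c        # O(Nc L d h_c)   proposed distributed in-cluster
in_network = N * M * I * d         # O(N M I d)      distributed in-network, eq. (35)
cent_pso = N * L * d * h_N         # O(N L d h_N)    centralized PSO-ML
cent_music = N * L0 * d * h_N**3   # O(N L0 d h_N^3) centralized MUSIC

print(proposed, in_network, cent_pso, cent_music)
```

Under these assumed values the proposed scheme has the smallest count and centralized MUSIC by far the largest, matching the ordering argued in the text.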
[Figure: sensor network topology for Example 1, nodes 1-24 with cluster centroids 1, 2 and 3 (x-axis vs. y-axis); node labels and axis values not reproduced.]
Fig. 6. RMSE vs. SNR plots in DOA estimation by centralized PSO-ML and CLPSO-ML,
centralized and decentralized MUSIC, proposed distributed in-clustering Algorithms
1A, 1B and 2, distributed in-network Algorithm 3, and theoretical centralized CRLB.
Fig. 5. Variation of objective function with iteration in Example 1.
Fig. 7. PR vs. SNR plots in DOA estimation by centralized PSO-ML and CLPSO-ML,
centralized and decentralized MUSIC, proposed distributed in-clustering Algorithms
1A, 1B and 2 and distributed in-network Algorithm 3.
Fig. 8. RMSE vs. SNR plots in DOA estimation by centralized PSO-ML and CLPSO-ML,
at Cluster 1 by using Algorithms 1A, 1B and 2, at Cluster 1 by PSO-ML, theoretical
CRLB at Cluster 1 and theoretical centralized CRLB.
[Fig. 10: sensor network topology for Example 2, nodes 1-40 with cluster centroids 1-4 (x-axis vs. y-axis); node labels and axis values not reproduced.]
Fig. 9. PR vs. SNR plots in DOA estimation by centralized PSO-ML and CLPSO-ML,
at Cluster 1 by using Algorithms 1A, 1B and 2 and at Cluster 1 by PSO-ML and
CLPSO-ML.
Fig. 11. Variation of cluster heads with iteration in Example 2.
Fig. 12. Variation of objective function with iteration in Example 2.
Fig. 13. RMSE vs. SNR plots in DOA estimation by PSO-ML and CLPSO-ML,
distributed in-clustering Algorithms 1A and 1B, and theoretical centralized CRLB.
3 dB. However, the same network gives better performance than the optimum theoretical value if the estimation is cooperative. This is because, in distributed estimation using the consensus algorithm over a fully connected network, each agent tries to achieve the global performance. In the case of Algorithm 2, the performance is slightly inferior to that of Algorithms 1A and 1B, but the block method saves communication overhead. So when energy is a major constraint in designing WSNs, Algorithm 2 is preferable, with a slight compromise in performance.
6.2. Example 2
In the second example, the proposed technique is examined in the presence of two closely spaced sources using a large sensor network of N = 40 nodes. The topology, shown in Fig. 10, has an average node degree d = 6. This value can be reduced by minimizing the communication links between nodes (i.e., by using a multi-hop network), because the performance of the algorithm does not depend on the connectivity of the nodes, unlike the distributed in-network algorithm. The clustering operation is carried out by the unsupervised k-means algorithm. In this case the network is divided into four clusters having 11, 10, 9 and 10 nodes (the average cluster size is 10), respectively. The nodes numbered in magenta, green, black and red belong to clusters 1, 2, 3 and 4, respectively. The central nodes 2, 22, 17 and 39 are chosen as the CHs of clusters 1, 2, 3 and 4, respectively. The variation of the cluster heads and of the objective function with iteration are given in Figs. 11 and 12, respectively.
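The clustering step described above can be sketched as plain unsupervised k-means on the node coordinates. The positions below are randomly generated stand-ins, not the Fig. 10 topology, so the resulting cluster sizes will differ from the 11/10/9/10 split reported in the text.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed stand-in for a 40-node topology (actual node positions are not reproduced here)
nodes = rng.uniform(-20.0, 20.0, (40, 2))

def kmeans(X, k, iters=100):
    """Plain unsupervised k-means: alternate nearest-centroid assignment and centroid update."""
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)                    # assign nodes to nearest centroid
        centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(nodes, k=4)
print(np.bincount(labels, minlength=4))   # number of nodes per cluster
```

The node of each cluster nearest its centroid is then a natural choice of cluster head, in the spirit of the CH selection above.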
The distributed SN receives signals from two sources at DOAs 80° and 82° relative to the x-axis. The number of snapshots is 20. In Fig. 13, the RMSE values obtained by the global network using PSO-ML and CLPSO-ML, by the small network at cluster 1 without any cooperation using PSO-ML, and by the distributed in-cluster Algorithms 1A and 1B are plotted and compared with the theoretical CRLB. The probability of resolution for the same methods is shown in Fig. 14. As evident from Figs. 13 and 14, the distributed in-cluster algorithms at cluster 1 outperform the individual cluster without cooperation. But in this distributed estimation the centralized network's performance is not achieved, because of the nature of the likelihood function. However, if the number of sensors is larger, the minima of the negative likelihood function are clearly distinguishable even at lower SNRs, though this better performance involves high communication overhead. The performance comparison in terms of RMSE at 5 dB SNR and after 100 iterations is listed in Table 2. The results of this table demonstrate that, due to distributed estimation, nearly 6 dB and 7 dB gains in RMSE are achieved over non-cooperative estimation using the DPSO-ML and DCLPSO-ML algorithms, respectively. The distributed RMSE performance is 1-2 dB less than the centralized one, but at the same time a substantial saving, of the order of O(100), in communication overhead is achieved. The CLPSO-based DOA estimation algorithms, in both centralized and distributed form, provide slightly better performance than the PSO-based algorithms.
Fig. 14. PR vs. SNR plots in DOA estimation by centralized PSO-ML and CLPSO-ML, proposed distributed in-clustering Algorithms 1A and 1B and distributed in-network Algorithm 3.

7. Conclusion
Table 2
Comparison of performances for different algorithms.

Algorithms                                RMSE performance in dB   Overall communication overhead
At cluster 1 using distributed PSO-ML     14.63                    O(9.6 x 10^3)
At cluster 1 using distributed CLPSO-ML   15.62                    O(9.6 x 10^3)
At cluster 1 without cooperation          8.32                     O(9.6 x 10^3)
Centralized PSO-ML                        16.73                    O(6 x 10^5)
Centralized CLPSO-ML                      16.78                    O(6 x 10^5)
Acknowledgment

Thanks to the authors of [28] for providing the CLPSO code.
References

[1] P. Stoica, K. Sharman, Maximum likelihood methods for direction-of-arrival estimation, IEEE Transactions on Acoustics, Speech and Signal Processing 38 (7) (1990) 1132–1143.
[2] J. Capon, High-resolution frequency-wavenumber spectrum analysis, Proceedings of the IEEE 57 (8) (1969) 1408–1418.
[3] R. Schmidt, Multiple emitter location and signal parameter estimation, IEEE Transactions on Antennas and Propagation 34 (3) (1986) 276–280.
[4] R. Roy, T. Kailath, ESPRIT-estimation of signal parameters via rotational invariance techniques, IEEE Transactions on Acoustics, Speech and Signal Processing 37 (7) (1989) 984–995.
[5] G.J. Pottie, W.J. Kaiser, Wireless integrated network sensors, Communications of the ACM 43 (May) (2000) 51–53.
[6] R.L. Moses, O.L. Moses, D. Krishnamurthy, R. Patterson, A self-localization method for wireless sensor networks, EURASIP Journal on Applied Signal Processing 4 (2002) 348–358.
[7] J.C. Chen, K. Yao, R.E. Hudson, Source localization and beamforming, IEEE Signal Processing Magazine (March) (2002) 30–39.
[8] M. Wax, T. Kailath, Decentralized processing in sensor arrays, IEEE Transactions on Acoustics, Speech and Signal Processing 33 (October) (1985) 1123–1129.
[10] T. Söderström, P. Stoica, Statistical analysis of decentralized MUSIC, Circuits, Systems, and Signal Processing 11 (4) (1992) 443–454.
[11] D. Estrin, M. Srivastav, Instrumenting the world with wireless sensor networks, in: International Conference on Acoustics, Speech, and Signal Processing (ICASSP), May 2001, pp. 2033–2036.
[12] S. Marano, V. Matta, P. Willett, Distributed estimation in large wireless sensor networks via a locally optimum approach, IEEE Transactions on Signal Processing 56 (2) (2008) 748–756.
[13] T. Panigrahi, G. Panda, B. Mulgrew, Novel distributed bearing estimation technique using diffusion PSO algorithm, IET Wireless Sensor Systems, http://dx.doi.org/10.1049/iet-wss.2011.0107, in press.
[14] A. Abbasi, M. Younis, A survey on clustering algorithms for wireless sensor networks, Computer Communications 30 (October) (2007) 2826–2841.
[15] A. Chowdhury, S. Das, Automatic shape independent clustering inspired by ant dynamics, Swarm and Evolutionary Computation 3 (2012) 33–45.
[16] A. Hatamlou, S. Abdullah, H. Nezamabadi-pour, A combined approach for clustering based on K-means and gravitational search algorithms, Swarm and Evolutionary Computation 6 (2012) 47–52, http://dx.doi.org/10.1016/j.swevo.2012.02.003.
[17] J. Senthilnath, S. Omkar, V. Mani, Clustering using firefly algorithm: performance study, Swarm and Evolutionary Computation 1 (3) (2011) 164–171.
[18] T. Niknam, B. Amiri, An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis, Applied Soft Computing 10 (1) (2010) 183–197.
[19] I. Ziskind, M. Wax, Maximum likelihood localization of multiple sources by alternating projection, IEEE Transactions on Acoustics, Speech and Signal Processing 36 (10) (1988) 1553–1560.