Ŵ_i ≠ W_i]. The capacity of the K-CIFC-CMS channel consists of all rate tuples (R_1, …, R_K) such that P_e^(N) → 0 as N → ∞.
A. The Gaussian Noise Channel
The single-antenna complex-valued K-CIFC-CMS with Additive White Gaussian Noise (AWGN), shown in Fig. 1 for the case K = 3, has input-output relationship

Y_i = Σ_{ℓ∈[1:K]} h_{iℓ} X_ℓ + Z_i,  i ∈ [1:K],  (1)

where, without loss of generality, the inputs are subject to the power constraint E[|X_i|²] ≤ 1, i ∈ [1:K], and the noises are marginally (the joint distribution does not matter as the receivers cannot cooperate) proper-complex Gaussian random variables with parameters Z_i ∼ N(0, 1), i ∈ [1:K].

B. The Linear Deterministic Channel

The linear deterministic K-CIFC-CMS has input-output relationship

Y_i = ⊕_{ℓ∈[1:K]} S^{m−n_{iℓ}} X_ℓ,  i ∈ [1:K],  (2)
where m := max_{(i,j)∈[1:K]²} n_{ij}, S is the binary shift matrix of dimension m, X_i and Y_i are binary vectors of length m, and the channel gains n_{ℓi}, (ℓ,i) ∈ [1:K]², are non-negative integers. The channel in (2) can be thought of as the high-SNR approximation of the channel in (1) with their parameters related as n_{ij} = ⌈log(1 + |h_{ij}|²)⌉, (i,j) ∈ [1:K]².
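To make the deterministic model concrete, the shift-matrix channel in (2) can be sketched numerically. The code below is an illustrative sketch (function and variable names are ours, not from the paper): S is the m×m binary down-shift matrix, so that S^{m−n} retains the n most significant bits of a length-m input.

```python
import numpy as np

def shift_matrix(m: int) -> np.ndarray:
    """m-by-m binary down-shift matrix S (ones on the first sub-diagonal)."""
    return np.eye(m, k=-1, dtype=int)

def ldc_outputs(X: np.ndarray, n: np.ndarray) -> np.ndarray:
    """Channel (2): Y_i = sum_l S^{m - n[i,l]} X_l over GF(2).

    X is a K-by-m binary array (one row per transmitter) and n is the
    K-by-K integer gain matrix, with m = max_{i,l} n[i,l]."""
    K, m = X.shape
    S = shift_matrix(m)
    Y = np.zeros_like(X)
    for i in range(K):
        for l in range(K):
            Y[i] = (Y[i] + np.linalg.matrix_power(S, m - n[i, l]) @ X[l]) % 2
    return Y

# Two users, direct gains n_d = 2, cross gains n_c = 1, so m = 2.
n = np.array([[2, 1], [1, 2]])
X = np.array([[1, 0], [1, 1]])
Y = ldc_outputs(X, n)
```

Each receiver sees its own m-bit word plus the interferer's top n_c bits shifted below its own, which is exactly the high-SNR bit-level picture the model is meant to capture.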
III. OUTER BOUND FOR THE K-USER CIFC WITH CMS

We obtain a general and computable outer bound next. The sum-capacity bound is obtained by giving S_i as side information to receiver i, i ∈ [1:K], where S_i := [S_{i−1}, W_{i−1}, Y^N_{i−1}], starting with S_1 := ∅. With this nested side information, the mutual information terms can be expressed in terms of entropies which may be recombined in ways that can be easily single-letterized. This form of the side information allows us to extend the result from the 3-user case [6] to any number of users. Other partial sum-rate bounds are obtained with a similar side information structure.
Theorem 1. The capacity region of a general memoryless K-CIFC-CMS is contained in the region defined by: for all i ∈ [1:K]

R_i ≤ I(Y_i; X_{[i:K]} | X_{[1:i−1]}),  (3a)

Σ_{j=i}^K R_j ≤ Σ_{j=i}^K I(Y_j; X_{[j:K]} | X_{[1:j−1]}, Y_{[i:j−1]}),  (3b)

for some joint input distribution P_{X_1,…,X_K}, where X_S := {X_i : i ∈ S} for an index set S ⊆ [1:K]. Moreover, each rate bound in (3b) may be tightened with respect to the channel joint conditional distribution as long as the channel conditional marginal distributions are preserved.
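To see the structure of (3b), consider our own specialization (not stated in the paper) to K = 2 and i = 1:

```latex
R_1 + R_2 \le I(Y_1; X_1, X_2) + I(Y_2; X_2 \mid X_1, Y_1).
```

The first term is a cooperative cut to receiver 1, while the second gives receiver 2 the primary's input and output as side information, recovering the familiar structure of 2-user cognitive outer bounds.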
Proof: The individual rate bounds in (3a) are trivial cut-set bounds; we therefore present the proof of (3b):

N Σ_{j=i}^K (R_j − ε_N)
≤ Σ_{j=i}^K I(Y_j^N; W_j)  (Fano's inequality)
≤ Σ_{j=i}^K I(Y_j^N, W_{[1:i−1]}, W_{[i:j−1]}, Y^N_{[i:j−1]}; W_j)
= Σ_{j=i}^K I(Y^N_{[i:j]}; W_j | W_{[1:j−1]})  (independence of messages)
= Σ_{j=i}^K Σ_{k=i}^j I(Y_k^N; W_j | W_{[1:j−1]}, Y^N_{[i:k−1]})  (chain rule)
= Σ_{k=i}^K Σ_{j=k}^K I(Y_k^N; W_j | W_{[1:j−1]}, Y^N_{[i:k−1]})
= Σ_{k=i}^K I(Y_k^N; W_{[k:K]} | W_{[1:i−1]}, W_{[i:k−1]}, Y^N_{[i:k−1]})
≤ Σ_{k=i}^K Σ_{t=1}^N I(Y_{k,t}; X_{[k:K],t} | X_{[1:k−1],t}, Y_{[i:k−1],t}).

Finally, by introducing a time-sharing random variable and using the fact that conditioning reduces entropy, we arrive at (3b).
2013 IEEE International Symposium on Information Theory
IV. SUM-CAPACITY OF THE LDC K-CIFC-CMS

For the LDC, we show a sum-capacity achieving scheme that only requires cognition of all messages at one transmitter. For simplicity we consider the symmetric case only. In a symmetric LDC all direct links have the same strength n_{ii} = n_d ≥ 0, i ∈ [1:K], and all the cross links have the same strength n_{ℓi} = n_c = α·n_d ≥ 0, (ℓ,i) ∈ [1:K]², ℓ ≠ i.
Theorem 2. The sum-capacity bound in (3b) for i = 1 is achievable for the symmetric LDC K-CIFC-CMS.

Proof: For the symmetric LDC K-CIFC-CMS the sum-capacity was shown to be upper bounded by [6], [12]:

Σ_{k=1}^K R_k ≤ n_d · { K·max{1, α} − α  for α ≠ 1;  1  for α = 1 },  α := n_c/n_d.  (4)

The discontinuity at α = 1 in (4) follows from the fact that when n_d = n_c the channel reduces to a K-user MAC with sum-capacity given by max H(Y_1) = n_d.
To show the achievability of (4), let U_j, j ∈ [1:K], be a vector composed of i.i.d. Bernoulli(1/2) bits. Let the transmit signals be

X_j = U_j,  j ∈ [1:K−1],

X_K = [ I_{n_c}  0_{n_c×[n_d−n_c]^+} ; 0_{[n_d−n_c]^+×n_c}  0_{[n_d−n_c]^+×[n_d−n_c]^+} ] (⊕_{j=1}^{K−1} U_j) + [ 0_{n_c×n_c}  0_{n_c×[n_d−n_c]^+} ; 0_{[n_d−n_c]^+×n_c}  I_{[n_d−n_c]^+} ] U_K,

where 0_{n×m} indicates the all-zero matrix of dimension n×m, I_n the identity matrix of dimension n, and [x]^+ := max{x, 0}.
Recall that operations are on GF(2). With these choices, it may be shown that Y_ℓ = (S^{m−n_d} + S^{m−n_c}) X_ℓ. Then, since the matrix S^{m−n_d} + S^{m−n_c} is full rank for n_d ≠ n_c, receiver ℓ, ℓ ∈ [1:K], decodes U_ℓ from (S^{m−n_d} + S^{m−n_c})^{−1} Y_ℓ = X_ℓ = U_ℓ.
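The two key claims of the proof, namely that the interference aligns so that Y_ℓ = (S^{m−n_d} + S^{m−n_c}) X_ℓ, and that this matrix is invertible over GF(2) when n_d ≠ n_c, can be checked numerically. The sketch below (our own code, with illustrative names) builds the transmit signals exactly as above and verifies both claims for a few symmetric gain pairs:

```python
import numpy as np

def S_pow(m: int, k: int) -> np.ndarray:
    """k-th power of the m-by-m binary down-shift matrix."""
    return np.linalg.matrix_power(np.eye(m, k=-1, dtype=int), k)

def verify_scheme(K: int, nd: int, nc: int, seed: int = 0) -> bool:
    rng = np.random.default_rng(seed)
    m, p = max(nd, nc), max(nd - nc, 0)          # p = [n_d - n_c]^+
    A = np.zeros((m, m), dtype=int); A[:nc, :nc] = np.eye(nc, dtype=int)
    B = np.zeros((m, m), dtype=int); B[nc:, nc:] = np.eye(p, dtype=int)
    U = rng.integers(0, 2, size=(K, m))
    X = U.copy()
    # transmitter K aligns the aggregate interference:
    X[K - 1] = (A @ U[:K - 1].sum(axis=0) + B @ U[K - 1]) % 2
    G = (S_pow(m, m - nd) + S_pow(m, m - nc)) % 2
    for l in range(K):
        Y_l = sum(S_pow(m, m - (nd if j == l else nc)) @ X[j]
                  for j in range(K)) % 2
        assert np.array_equal(Y_l, (G @ X[l]) % 2)   # interference is aligned
    # G = I + S^{|n_d - n_c|} is unipotent, hence full rank over GF(2)
    assert round(abs(np.linalg.det(G))) % 2 == 1
    return True

for nd, nc in [(5, 3), (3, 5), (4, 1)]:
    verify_scheme(3, nd, nc)
```

The full-rank fact is immediate once one notes that m = max{n_d, n_c}, so one of the two exponents is zero and G = I + S^{|n_d−n_c|} is lower triangular with a unit diagonal.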
V. SUM-CAPACITY OF THE GAUSSIAN K-CIFC-CMS

A. Upper Bound

Evaluating the sum-rate bound in (3b) for i = 1 for the symmetric Gaussian channel, where Y_u = |h_d| X_u + h_i Σ_{k≠u} X_k + Z_u, gives

Σ_{u=1}^K R_u ≤ Σ_{u=1}^K I(X_u, …, X_K; Y_u | X_1, Y_1, …, X_{u−1}, Y_{u−1})
= I(X_1, …, X_K; |h_d| X_1 + h_i Σ_{i=2}^K X_i + Z_1)
+ Σ_{u=2}^{K−1} I(X_u, …, X_K; |h_d| X_u + h_i Σ_{i=u+1}^K X_i + Z_u, {h_i Σ_{i=u}^K X_i + Z_ℓ : ℓ ∈ [1:u−1]})
+ I(X_K; |h_d| X_K + Z_K, {h_i X_K + Z_ℓ : ℓ ∈ [1:K−1]})
≤ h(|h_d| X_1 + h_i Σ_{i=2}^K X_i + Z_1) − h(Z_1)
+ Σ_{u=2}^{K−1} [ h((|h_d| − h_i) X_u + Z_u − Z_{u−1}) − h(Z_u) ]
+ h(|h_d| X_K + Z_K − h_i X_K − (1/(K−1)) Σ_{ℓ=1}^{K−1} Z_ℓ) − h(Z_K).
Finally, by the Gaussian maximizes entropy principle:

Σ_{k=1}^K R_k ≤ log(1 + (|h_d| + (K−1)|h_i|)²)  (5a)
+ (K−2) log(2) + (K−2) log(1 + ||h_d| − h_i|²/2)  (5b)
+ log(1 + |h_d|²/(1 + (K−1)|h_i|²)).  (5c)

For h_i = |h_d| all received signals are statistically equivalent, therefore the K-CIFC-CMS is equivalent to a K-user MAC with correlated inputs, whose sum-capacity is bounded by Σ_{k=1}^K R_k ≤ log(1 + K²|h_d|²).
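For later reference, the upper bound (5) is easy to evaluate numerically. The sketch below is illustrative code of our own (not from the paper); it computes (5a)+(5b)+(5c) in bits, taking hd = |h_d| ≥ 0 and a complex cross gain hi:

```python
import numpy as np

def sum_capacity_upper(K: int, hd: float, hi: complex) -> float:
    """Upper bound (5) on the sum-rate, in bits per channel use."""
    t_a = np.log2(1 + (hd + (K - 1) * abs(hi)) ** 2)              # (5a)
    t_b = (K - 2) * (1 + np.log2(1 + abs(hd - hi) ** 2 / 2))      # (5b); log(2) = 1 bit
    t_c = np.log2(1 + hd ** 2 / (1 + (K - 1) * abs(hi) ** 2))     # (5c)
    return t_a + t_b + t_c
```

For K = 2 the middle term vanishes and the bound reduces to the two remaining cut-like terms.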
B. Achievable Scheme

In this section we describe a scheme which will be used to show a constant gap to the symmetric upper bound derived in Section V-A. Inspired by the capacity achieving strategy for the Gaussian MIMO-BC, we introduce a scheme that uses Dirty Paper Coding (DPC) with encoding order 1 → 2 → 3 → ⋯ → K. We denote by Θ_ℓ the covariance matrix corresponding to the message intended for decoder ℓ,
ℓ ∈ [1:K], as transmitted across the K antennas/transmitters.
The input covariance matrix is

Cov[X_1, …, X_K] = Σ_{ℓ=1}^K Θ_ℓ :  [Σ_{ℓ=1}^K Θ_ℓ]_{k,k} ≤ 1,  k ∈ [1:K],  (6a)

where the constraints on the diagonal elements correspond to the input power constraints. Moreover, since message ℓ can only be broadcasted by transmitters with index at least ℓ, we further impose

[Θ_ℓ]_{k,k} = 0  for all 1 ≤ k < ℓ ≤ K.  (6b)
The achievable rate region is then the set of non-negative rates (R_1, …, R_K) that satisfy, for h_ℓ := [h_{ℓ,1} h_{ℓ,2} … h_{ℓ,K}],

R_ℓ ≤ log(1 + h_ℓ Θ_ℓ h_ℓ^H / (1 + h_ℓ (Σ_{k=ℓ+1}^K Θ_k) h_ℓ^H)),  (7)

for ℓ ∈ [1:K] and all possible Cov[X_1, …, X_K] complying with (6), with the convention that Σ_{k=K+1}^K Θ_k = 0.
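The region (7) can be evaluated directly once the covariances Θ_ℓ are fixed. A minimal sketch (our own helper, assuming row channel vectors h_ℓ as defined above):

```python
import numpy as np

def dpc_rates(H: np.ndarray, Thetas: list) -> list:
    """Rates (7) in bits: user l treats only messages encoded after it as noise.

    H is K-by-K complex with rows h_l; Thetas is a list of K PSD matrices."""
    K = H.shape[0]
    rates = []
    for l in range(K):
        h = H[l]
        signal = (h @ Thetas[l] @ h.conj()).real
        noise = 1 + sum((h @ T @ h.conj()).real for T in Thetas[l + 1:])
        rates.append(float(np.log2(1 + signal / noise)))
    return rates
```

As a sanity check, two non-interfering users with unit-power single-antenna inputs each get 1 bit per channel use.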
In particular we consider the transmit signals

X_1 = γ_1 U_1,
X_j = α_j U_j + β_j U_j^(ZF) + γ_j U_1,  j ∈ [2:K−1],
X_K = α_K U_K − β_K Σ_{j=2}^{K−1} U_j^(ZF) + γ_K U_1,

where U_ℓ, ℓ ∈ [1:K], and U_j^(ZF), j ∈ [2:K−1], are independent unit-power codewords, and the coefficients are such that

|γ_1|² ≤ 1,
|α_j|² + |β_j|² + |γ_j|² ≤ 1,  j ∈ [2:K−1],
|α_K|² + |β_K|²(K−2) + |γ_K|² ≤ 1,

in order to satisfy the power constraints. Notice the negative sign for β_K, which we shall use to implement zero-forcing of the aggregate interference Σ_{j=2}^{K−1} U_j^(ZF). Moreover, all transmitters cooperate in beamforming U_1 to receiver 1. These two facts can be easily seen by observing that, for β_2 = ⋯ = β_K := β,

Σ_{ℓ=1}^K X_ℓ = Σ_{ℓ=2}^K α_ℓ U_ℓ + γ U_1,  γ := Σ_{ℓ=1}^K γ_ℓ.
With these choices the message covariance matrices are

Θ_1 = a a^H,  a := [γ_1, …, γ_K]^T,
Θ_j = |α_j|² e_j e_j^H + |β|² (e_j − e_K)(e_j − e_K)^H,  j ∈ [2:K],

where e_j indicates a length-K vector of all zeros except for a one in position j, j ∈ [1:K], (·)^H indicates the Hermitian transpose, and where β := β_2 = ⋯ = β_K. We next express the channel vectors as

h_ℓ = (|h_d| − h_i) e_ℓ + h_i (Σ_{k=1}^K e_k),  ℓ ∈ [1:K].

By noticing that h_ℓ e_j = δ[ℓ − j](|h_d| − h_i) + h_i, (ℓ,j) ∈ [1:K]², where δ[k] is the Kronecker delta function, the following rates (see [12]) are achievable:
R_1 = log(1 + (|h_d| + |h_i| Σ_{j=2}^K γ_j)² / (1 + |h_i|² Σ_{k=2}^K |α_k|²)),  (8a)

R_j = log(1 + (||h_d| − h_i|² |β|² + |h_d|² |α_j|²) / (1 + |h_i|² Σ_{k=j+1}^K |α_k|²)),  (8b)

R_K = log(1 + |h_d|² |α_K|²),  (8c)

for j ∈ [2:K−1] and with γ_1 = exp(j∠h_i) (notice the phase of γ_1, which allows coherent combining at receiver 1 of the different signals carrying U_1, i.e., all users beamform to the primary receiver).
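The closed-form rates (8) can likewise be coded up; the helper below (illustrative code of our own, taking real non-negative α, β, γ magnitudes as in the text) is useful for checking the gap claims that follow:

```python
import numpy as np

def rates_eq8(K, hd, hi, alpha, beta, gamma):
    """Achievable rates (8a)-(8c) in bits.

    alpha, gamma: length-K lists of magnitudes (index 0 <-> user 1);
    alpha[0] is unused since X_1 = gamma_1 U_1; beta is the common |beta|."""
    num1 = (hd + abs(hi) * sum(gamma[1:])) ** 2
    den1 = 1 + abs(hi) ** 2 * sum(a ** 2 for a in alpha[1:])
    R = [np.log2(1 + num1 / den1)]                                  # (8a)
    for j in range(1, K - 1):                                       # users 2..K-1
        num = abs(hd - hi) ** 2 * beta ** 2 + hd ** 2 * alpha[j] ** 2
        den = 1 + abs(hi) ** 2 * sum(a ** 2 for a in alpha[j + 1:])
        R.append(np.log2(1 + num / den))                            # (8b)
    R.append(np.log2(1 + hd ** 2 * alpha[K - 1] ** 2))              # (8c)
    return R
```

With all cross gains set to zero and only users 1 and K active, the three-user rates collapse to two interference-free point-to-point terms, as expected.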
C. Constant Additive Gap
Theorem 3. The sum-capacity bound in (3b) is achievable for the symmetric Gaussian K-CIFC-CMS to within 6 bits per channel use for K = 3 and to within (K−2) log₂(K−2) + 3.88 bits per channel use for K ≥ 4.
Proof: We now choose the parameters in (8) so as to match the upper bound in (5). We recall that user K is the most cognitive user and can therefore precode the whole interference seen at its receiver by using DPC; by doing so, receiver K does not have anything to treat as noise besides the Gaussian noise itself. We therefore interpret the term 1/(1 + (K−1)|h_i|²) ≤ 1 in (5c) as the fraction of power transmitter K dedicates to its own signal. This amounts to setting

|α_K|² = 1/(1 + (K−1)|h_i|²)

in (8c). This choice guarantees that (8c) exactly matches (5c).
Next we match the upper bound term in (5b) to the achievable rates in (8b) by setting α_j = 0, j ∈ [2:K−1], and

1/2 = |β|² / (1 + |h_i|² |α_K|²).

However, from the power constraint for user K, |β|² ≤ (1 − |α_K|²)/(K−2), which imposes the following condition

(K−4)/(K−2) + (|h_i|² + 2/(K−2)) |α_K|² ≤ 0.
The above condition cannot be satisfied for K ≥ 4; for K = 3 it requires that

|α_3|² = 1/(1 + 2|h_i|²) ≤ 1/(|h_i|² + 2),

which can be satisfied for |h_i|² ≥ 1. Therefore, in the following we shall assume |h_i|² ≥ 1 and set α_j = 0, j ∈ [2:K−1], and
|β|² = { (1 − |α_K|²)/(K−2) = (1/(K−2))(1 − 1/(1 + (K−1)|h_i|²))  for K ≥ 4;
(1 + |h_i|²|α_3|²)/2 = (1 + 3|h_i|²)/(2(1 + 2|h_i|²))  for K = 3 },
which implies

|γ_K|² = { 0  for K ≥ 4;
1 − |β|² − |α_K|² = (|h_i|² − 1)/(2(1 + 2|h_i|²))  for K = 3 }.
Finally, for j ∈ [2:K−1],

|γ_j|² = 1 − |β_j|² = { (K−3)/(K−2) + (1/(K−2)) · 1/(1 + (K−1)|h_i|²)  for K ≥ 4;
(1 + |h_i|²)/(2(1 + 2|h_i|²))  for K = 3 }.
The rates may then be bounded for K ≥ 4 (using |h_i|² ≥ 1):

R_K = log(1 + |h_d|²/(1 + (K−1)|h_i|²)),
R_j ≥ log(1 + ||h_d| − h_i|²/(K−2) · (K−1)/(K+1)),  j ∈ [2:K−1],
R_1 ≥ log(1 + (|h_d| + |h_i| √((K−3)(K−2)))²/2),
and for K = 3:

R_3 = log(1 + |h_d|²/(1 + 2|h_i|²)),
R_2 = log(1 + ||h_d| − h_i|² · (1/2)),
R_1 ≥ log(1 + (|h_d| + |h_i|/2)²/2).
By taking the difference between the upper bound in (5) and the derived achievable rates, we find that the gap is upper bounded, for K ≥ 4, by

GAP ≤ (K−2) log(K−2) + log(2 exp(2))

(where we used K log_e(1 + 1/K) ≤ 1), and for K = 3 by

GAP ≤ log(2) + log(1 + (|h_d| + 2|h_i|)²) − log(1 + (|h_d| + |h_i|/2)²/2) ≤ 6 log(2).
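The 6-bit claim for K = 3 can be spot-checked numerically with the parameter choices above. The sketch below is our own code, under the stated assumption |h_i|² ≥ 1 and with real non-negative gains:

```python
import numpy as np

def gap_K3(hd: float, hi: float) -> float:
    """Upper bound (5) minus the achieved sum of (8) for K = 3, in bits."""
    h2 = hi ** 2                                      # |h_i|^2, assumed >= 1
    a3_sq = 1 / (1 + 2 * h2)                          # |alpha_3|^2
    b_sq = (1 + 3 * h2) / (2 * (1 + 2 * h2))          # |beta|^2
    g2 = np.sqrt((1 + h2) / (2 * (1 + 2 * h2)))       # gamma_2
    g3 = np.sqrt((h2 - 1) / (2 * (1 + 2 * h2)))       # gamma_3
    # upper bound (5):
    ub = (np.log2(1 + (hd + 2 * hi) ** 2)
          + 1 + np.log2(1 + (hd - hi) ** 2 / 2)
          + np.log2(1 + hd ** 2 * a3_sq))
    # achieved rates (8) with alpha_2 = 0:
    r1 = np.log2(1 + (hd + hi * (g2 + g3)) ** 2 / (1 + h2 * a3_sq))
    r2 = np.log2(1 + (hd - hi) ** 2 * b_sq / (1 + h2 * a3_sq))
    r3 = np.log2(1 + hd ** 2 * a3_sq)
    return ub - (r1 + r2 + r3)

gaps = [gap_K3(hd, hi) for hd in (0.0, 0.5, 1, 2, 5, 10, 50)
                        for hi in (1, 2, 5, 10, 30)]
assert all(0 <= g <= 6 for g in gaps)
```

On this grid the gap stays well below 6 bits; the choice of |β|² makes the R_2 term match (5b) up to exactly the residual 1-bit log(2) term.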
For |h_i|² < 1, set β_j = γ_j = 0 and α_j = 1 for j ∈ [2:K], so that

Σ_{ℓ=1}^K R_ℓ = Σ_{ℓ=1}^K log(1 + |h_d|²/(1 + (K−ℓ)|h_i|²)).
The gap to the upper bound is at most

GAP ≤ (K−2) log(2) + 2 log(K−1) + Σ_{ℓ=2}^{K−1} log(K − ℓ + 2),

which is smaller than the gap for |h_i|² ≥ 1. Details on the calculation of the gaps are found in [12].
D. Constant Multiplicative Gap
We next consider the sum-capacity to within a multiplicative gap, which is more relevant at low SNR than an additive gap.
Theorem 4. The symmetric sumcapacity is achievable to
within a factor K by beamforming to the primary user.
Proof: The rate of user j in (3a) is upper bounded by

C_j := log(1 + (|h_d| + (K−j)|h_i|)²) ≤ C_1,  j ∈ [1:K];

this implies that the sum-rate is upper bounded by K·C_1. Consider an achievability scheme in which all users beamform to user 1: this achieves the sum-rate R_1 + ⋯ + R_K = C_1. This is to within a factor K of the upper bound, proving Th. 4.
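Numerically, the factor-K claim looks as follows. The sketch (our own illustrative code) computes the ratio of the sum of the per-user bounds C_j, which is itself at most K·C_1, to the beamformed sum-rate C_1:

```python
import numpy as np

def sum_bound_ratio(K: int, hd: float, hi: float) -> float:
    """Ratio of sum_j C_j (at most K*C_1) to the beamforming sum-rate C_1."""
    C = [np.log2(1 + (hd + (K - j) * hi) ** 2) for j in range(1, K + 1)]
    return sum(C) / C[0]   # C[0] = C_1 is achieved by beamforming to user 1

ratio = sum_bound_ratio(3, 1.0, 1.0)   # comfortably below K = 3
```

The ratio is strictly below K whenever |h_d| > 0, since the later cut-set terms C_j shrink with j.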
Interestingly, the achievable scheme only requires user K to be cognitive of the messages of all other users, while the rest of the users only need to know the message of user 1 in addition to their own message; this considerably relaxes the CMS requirement.
VI. CONCLUSION
We presented a general outer bound region for the K-user cognitive interference channel with cumulative message sharing. This computable outer bound was used to show that the symmetric sum-capacity is exactly achievable for the linear deterministic channel and to within a constant gap for the Gaussian noise channel. While the focus of this paper was on cumulative message sharing, we note that, interestingly, the same sum-capacity outer bound may be achieved with a different message sharing structure with less message knowledge at the cognitive transmitters.
ACKNOWLEDGMENT
The work of the authors was partially funded by NSF under award 1017436. The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the NSF.
REFERENCES
[1] N. Devroye, P. Mitran, and V. Tarokh, "Achievable rates in cognitive radio channels," IEEE Trans. Inf. Theory, vol. 52, no. 5, pp. 1813-1827, May 2006.
[2] S. Rini, D. Tuninetti, and N. Devroye, "New inner and outer bounds for the discrete memoryless cognitive interference channel and some new capacity results," IEEE Trans. Inf. Theory, vol. 57, no. 7, pp. 4087-4109, Jul. 2011.
[3] S. Rini, D. Tuninetti, and N. Devroye, "On the capacity of the Gaussian cognitive interference channel: new inner and outer bounds and capacity to within 1 bit," IEEE Trans. Inf. Theory, 2012.
[4] K. Nagananda and C. Murthy, "Information theoretic results for three-user cognitive channels," in Proc. IEEE Global Telecommun. Conf., 2009.
[5] A. Jafarian and S. Vishwanath, "On the capacity of multi-user cognitive radio networks," in Proc. IEEE Int. Symp. Inf. Theory, 2009, pp. 601-605.
[6] D. Maamari, N. Devroye, and D. Tuninetti, "The sum-capacity of the linear deterministic three-user cognitive interference channel," in Proc. IEEE Int. Symp. Inf. Theory, Cambridge, MA, Jul. 2012.
[7] K. G. Nagananda, P. Mohapatra, C. R. Murthy, and S. Kishore, "Multiuser cognitive radio networks: An information theoretic perspective," http://arxiv.org/abs/1102.4126.
[8] K. Nagananda, C. Murthy, and S. Kishore, "Achievable rates in three-user interference channels with one cognitive transmitter," in Int. Conf. on Signal Proc. and Comm. (SPCOM), 2010.
[9] M. Mirmohseni, B. Akhbari, and M. Aref, "Capacity bounds for the three-user cognitive Z-interference channel," in 12th Canadian Workshop on Information Theory (CWIT), 2011, pp. 34-37.
[10] M. G. Kang and W. Choi, "The capacity of a three user interference channel with a cognitive transmitter in strong interference," in Proc. IEEE Int. Symp. Inf. Theory, 2012.
[11] A. Avestimehr, S. Diggavi, and D. Tse, "A deterministic approach to wireless relay networks," in Proc. Allerton Conf. Commun., Control and Comp., Sep. 2007.
[12] D. Maamari, D. Tuninetti, and N. Devroye, "Approximate sum-capacity of K-user cognitive interference channels with cumulative message sharing," http://arxiv.org/abs/1301.6198.