
Measurements of downlink power level distributions in LTE networks

D. Colombi¹, B. Thors¹, N. Wirén², L.-E. Larsson², C. Törnevik¹

¹ Ericsson Research, SE-164 80 Stockholm, Sweden,
e-mail: {davide.colombi,bjorn.thors,christer.tornevik}@ericsson.com
² TeliaSonera, SE-123 86 Farsta, Sweden,
e-mail: {niklas.wiren,lars-eric.larsson}@teliasonera.com
Abstract: Human exposure to the radio frequency electromagnetic fields (EMF) emitted by radio base stations (RBS) is proportional to the transmitted power. Thus, knowledge of actual power levels is key for a correct estimate of the actual EMF exposure. In this study, downlink output power distributions for RBSs in a 4G LTE mobile communication network have been determined. By extracting data using the operations support system (OSS) of the network, statistics were obtained based on 24-hour measurements for more than 5000 RBSs. The network measurement approach was verified with in-situ power density measurements. It was found that the actual output power levels are significantly below the theoretical maximum. For high-traffic periods, the 90th percentile transmitted power was found to be about 12% of the theoretical maximum.
1 INTRODUCTION
Electromagnetic field (EMF) compliance assessments
of radio base stations (RBSs) are normally conducted
both before placing the products on the market and
putting them into operation on site. The assessments
ensure that the tested equipment and installations
comply with globally recognized exposure limits,
such as those published by the International
Commission on Non-Ionizing Radiation Protection
[1].
The EMF exposure levels for RBSs are
proportional to the transmitted power. In order to
assess compliance of RBS products, relevant
standards and recommendations [2], [3] specify that
measurements shall be conducted when the RBS is
transmitting at its maximum power or using
techniques allowing extrapolation to maximum
power conditions. In reality, however, the output
power varies over time with the amount of conveyed
traffic, and it can be regulated by power control
mechanisms depending on the instantaneous radio
channel conditions. As a result of this, the actual
maximum emitted power may be well below the
theoretical maximum. Furthermore, at mobile
communication frequencies, it is the transmitted
power averaged over six minutes which is of
interest from an RF exposure compliance point of
view rather than the peak power [1].
In [4], it was found that the average transmitted
power for 2G (GSM) and 3G (WCDMA) mobile
communication networks may be considerably below
the theoretical maximum. To be more specific, the
90th percentile averaged output power was found to
be 65% and 45% of the maximum for 2G and 3G,
respectively. Similar results have also been published
in [5] and [6]. As a consequence, RF exposure
assessments of 2G and 3G RBSs conducted for
maximum transmit power conditions might produce
unrealistic results.
To the best of our knowledge, there are no similar
studies currently available in the literature on
downlink power levels for the Long Term Evolution
(LTE) radio access technology. The first commercial
LTE networks for mobile broadband were launched
in 2009. Since then, more than one hundred networks
have been deployed throughout the world to satisfy
an increased demand for higher data rates. LTE has
been recognized as a 4G radio access technology by
ITU [7], and innovative changes have been
introduced compared with previous 2G and 3G
technologies to enhance performance and increase
spectrum efficiency.
In this work, LTE downlink output power
distributions have been assessed based on data
obtained for more than 5000 TeliaSonera RBS cells
in Sweden. To allow a simultaneous collection of
data for these sites, network based measurements
were conducted via the operations support system
(OSS), normally used by operators to monitor,
control and analyze the performance of their
networks. With this approach, statistics can be
generated, based on an extensive number of samples,
reflecting whole regions and a multitude of
environments.
The next section provides a description of the
method used and some details of the selected
network. Results are presented in Section 3 and
discussed in Section 4. Finally, conclusions are
presented in Section 5.
978-1-4673-5707-4/13/$31.00 © 2013 IEEE



2 METHOD
2.1 Network measurements
Network performance management is the basis for
optimization, supervision and troubleshooting of
radio communication networks. Software and
solutions are provided by the OSS for processing
recorded network data and generating statistics. In
this work, the Performance Statistics application in
the Ericsson Operation and Maintenance system has
been used. For each LTE cell, events such as
connection setups, handovers and capacity
utilization are recorded and can be accumulated over
time using specific counters. Since these counters are
measured directly at the network side, they are
subject to only small uncertainties.
Here, the objective was to make use of suitable
counters to measure the downlink output power
normalized to the available maximum (P/Pmax). A
dedicated counter enabling direct measurements of
the downlink power levels was not available. As a
consequence, a model was needed by which
measurable quantities could be converted into power
levels. A simple model was obtained from the
observation that the amount of power transmitted in
the downlink for LTE is directly proportional to the
allocated bandwidth [8]. In contrast to 2G and 3G
technologies, LTE does not make use of dynamic
power control to compensate for variations in the
channel conditions, and the amount of transmitted
power per bandwidth (watts per MHz) is kept
constant. The power level was determined from a
counter recording the downlink physical resource
block (RB) utilization.
The data from the OSS were obtained as empirical
probability density functions (PDFs) of the RB
utilization for every 15-minute period. Values were
sorted into bins with a width of 10% of the maximum
RB utilization. From the reported PDFs, averaged
output power values, corresponding to each 15-minute
period, were calculated assuming an RB
utilization equal to the central value of each bin.
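The bin-to-power conversion just described can be sketched as follows. This is our illustration, not the authors' processing code: the 10% bin width and the bin-centre assumption follow the text, while the function name and sample counts are hypothetical.

```python
# Sketch of the model above: LTE downlink power is proportional to
# resource block (RB) utilization, so the 15-minute average of the
# normalized power P/Pmax can be estimated from the reported
# RB-utilization histogram using the central value of each 10% bin.

def avg_normalized_power(bin_counts):
    """bin_counts: sample counts for ten 10%-wide bins covering
    0-100% RB utilization. Returns the average P/Pmax."""
    centers = [0.05 + 0.10 * i for i in range(10)]  # bin centres
    total = sum(bin_counts)
    if total == 0:
        return 0.0
    return sum(c * n for c, n in zip(centers, bin_counts)) / total

# Example: a 15-minute period dominated by low RB utilization
low_load = avg_normalized_power([80, 10, 5, 3, 1, 1, 0, 0, 0, 0])
```

A period with most samples in the lowest bin yields an average normalized power of a few percent, consistent with the distributions reported in Section 3.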
Since network data were reported for 15-minute
periods, the averaging time is 2.5 times longer than
the averaging time normally stipulated for RF EMF
exposure assessments [1]. As will be shown in
Section 4, however, the difference between six- and
fifteen-minute averaging, in terms of the resulting
downlink power distributions, is small.
The data collected correspond to about 500 000
samples of average power gathered for more than
5000 cells in Sweden during a weekday in May 2013
(24-hour measurements). The samples include cells
operating in both LTE band 7 (2620–2690 MHz)
and LTE band 20 (791–821 MHz), covering urban
and rural areas. The channel bandwidths for the
investigated networks were 20 MHz and 10 MHz for
band 7 and band 20, respectively.

2.2 In-situ measurements
Normalized output power levels obtained from the
network based measurements were compared and
verified with in-situ measurements for one LTE cell
in Stockholm. The power density integrated over the
LTE channel bandwidth was measured and recorded
every 12 seconds during more than 7 hours at a point
in the vicinity of the RBS antenna using a Narda
SRM 3006 frequency selective radiation meter
(Narda Safety Test Solutions, Hauppauge, NY,
USA). Simultaneously, instantaneous measurements
of the cell reference signal were conducted and
extrapolated to the theoretical maximum power
density that would be obtained if the RBS was
transmitting at full power [9]. The ratio between the
measured instantaneous power density and the
corresponding extrapolated maximum should, when
time-averaged over 15 minutes, ideally equal the
normalized RBS output power obtained by means of
network measurements. To produce a meaningful
comparison, the network was loaded periodically by
downloading, in sequence and with a certain time
interval, a large file using an LTE modem connected
to the selected cell.
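The verification step above can be sketched as follows. This is our illustrative reconstruction, not the authors' code; the 12-second sampling (75 samples per 15-minute window) follows the text, while the function name is hypothetical.

```python
# Sketch of the in-situ verification: average the measured power
# densities over 15-minute windows (75 samples at one sample per 12 s)
# and divide by the extrapolated maximum power density. The resulting
# ratio should ideally match the network-reported normalized power.

def windowed_normalized_power(densities, max_density, window=75):
    """densities: power-density samples recorded every 12 seconds.
    Returns one normalized power value per complete 15-minute window."""
    ratios = []
    for start in range(0, len(densities) - window + 1, window):
        mean = sum(densities[start:start + window]) / window
        ratios.append(mean / max_density)
    return ratios
```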
3 RESULTS
The empirical cumulative distribution function (CDF)
for the overall network data is shown in Figure 1.
Results are also shown for periods of high and low
traffic, from 4 PM to 8 PM and from 1 AM to
5 AM, respectively. The results show that the
obtained normalized RBS output power for LTE is
significantly below the theoretical maximum with a
median output power of about 6%.
A subset of the available site data was used to
compare statistics for urban and rural environments.
The results, based on data gathered from 10 cells in
each environment, show that the normalized output
power is larger for the rural than for the urban
environments. A summary of some statistical metrics
for the obtained LTE results is provided in Table 1.









Figure 1: Empirical CDFs of normalized RBS output
power in the considered LTE network illustrating the
variation between low and high traffic hours.

                          Mean   Median   90th percentile
All sites, 24 h            6.8     6.4         8.2
All sites, low
traffic hours              5.5     6.3         7.3
All sites, high
traffic hours              7.9     6.5        12
10 urban cells, 24 h       5.1     6.3         7.3
10 rural cells, 24 h       8.9     6.5        14

Table 1: Mean, median and 90th percentile of the normalized downlink output power (in percent) for the considered LTE network.

The results from the in-situ verification are shown
in Figure 2. The explanation for the rather periodic
peaks in the power transmitted by the RBS is the
employed traffic generation scheme mentioned in
Section 2.2. Despite the fact that the in-situ
measurements are affected by radio link fading and
instrument uncertainty (-3.8 dB to +2.7 dB, k=2 [10]),
Figure 2 shows a strong correlation with the power
samples collected from the network.


Figure 2: Verifying in-situ measurements compared
with network based measurements of normalized
output power time-averaged over 15 minutes for one
LTE cell.
4 DISCUSSION
The LTE RBS aims to maximize the throughput
during transmission by using all available bandwidth,
i.e. all available RBs. No power control algorithm is
employed in downlink and the power per used RB is
constant. Therefore, it is not unlikely that the RBS
will transmit at or close to maximum power during
short periods of time. By maximizing the data rate,
however, the transmission time is minimized
reducing the average power correspondingly. This is
confirmed in Figure 1, which shows that the
15-minute averaged output power transmitted for an
LTE network is typically much lower than the
maximum value.
The mean output power obtained in the TeliaSonera
4G network considering the overall data set was less
than 7% of the available maximum. During high-
traffic hours and for the subset with 10 selected rural
cells, the corresponding values were about 8% and
9%, respectively.
For the selected urban sites, the downlink output
power was found to be lower compared with the
corresponding rural sites. This is most likely a
consequence of the larger rural cell coverage area
providing service to a larger number of users.
The 90th percentile of the normalized transmitted
power determined in [4] for a 3G network in Sweden
was 35%, which is larger than what was obtained here
for LTE.
for LTE. Although the considered network has been
operating commercially since 2009, the current 4G
LTE penetration is lower than for 2G and 3G. The
amount of conveyed traffic, and thereby the average
downlink power, might increase in the future.
As mentioned previously, at mobile communication
frequencies ICNIRP [1] specifies an averaging time
for exposure assessments of 6 minutes, whereas the
network counters produced results averaged over 15
minutes. The impact of the longer averaging time was
assessed by conducting a 24-hour in-situ
measurement of the power density from an arbitrarily
selected LTE site, and CDFs were determined with
data averaged over 6 and 15 minutes. The results are
shown in Figure 3 which indicates that very similar
statistics are obtained for the two investigated
averaging times.


Figure 3: Comparison of empirical CDFs for different
averaging times.
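As a rough sketch of this averaging-time check (our illustration; only the window lengths follow the text, all names are hypothetical), the same 12-second samples can be averaged over both window lengths and the resulting empirical CDFs compared:

```python
# Average the same 12-second power samples over 6-minute (30-sample)
# and 15-minute (75-sample) windows, then build empirical CDFs to
# compare the two averaging times.

def block_averages(samples, window):
    """Non-overlapping block averages over complete windows."""
    return [sum(samples[i:i + window]) / window
            for i in range(0, len(samples) - window + 1, window)]

def empirical_cdf(values):
    """Sorted (value, cumulative probability) pairs."""
    xs = sorted(values)
    n = len(xs)
    return [(x, (k + 1) / n) for k, x in enumerate(xs)]

# cdf_6min = empirical_cdf(block_averages(samples, 30))
# cdf_15min = empirical_cdf(block_averages(samples, 75))
```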

5 CONCLUSIONS
RBS output power distributions have been
determined and analyzed for an LTE network in
Sweden. By means of network based measurements,
it was possible to gather data for more than 5000
RBSs during 24 hours. The downlink output power
was found to be significantly below the maximum
available power. Of the about 500 000 time-averaged
power samples collected, no sample was found in the
bin corresponding to a time-averaged normalized
output power larger than 90%. The 90th percentile of
the transmitted power during high-traffic hours was
found to be 12% of the maximum.
The results show that EMF compliance assessments
of RBSs, typically conducted for maximum
transmitted power configurations, might not be
representative of realistic exposure conditions. This
is especially the case for compliance assessments
of shared sites, if all RF sources and technologies
present are required to transmit at maximum power
simultaneously and for several minutes.
Since LTE is a relatively new technology, and the
amount of conveyed traffic is expected to increase in
the future, it could be interesting to repeat this study
in a few years when the LTE penetration is higher.


References
[1] ICNIRP, "Guidelines for limiting exposure to time-varying electric, magnetic, and electromagnetic fields (up to 300 GHz)," Health Physics, vol. 74, pp. 494–522, 1998.
[2] CENELEC EN 50400:2006, "Basic standard to demonstrate the compliance of fixed equipment for radio transmission (110 MHz – 40 GHz) intended for use in wireless telecommunication networks with the basic restrictions or the reference levels related to general public exposure to radio frequency electromagnetic fields, when put into service," 2006.
[3] ITU-T K.61:2008, "Guidance on measurement and numerical prediction of electromagnetic fields for compliance with human exposure limits for telecommunication installations."
[4] D. Colombi, B. Thors, T. Persson, N. Wirén, L.-E. Larsson, and C. Törnevik, "Output power distributions of mobile radio base stations based on network measurements," in IOP Conf. Series: Materials Science and Engineering 44, 2013.
[5] W. Joseph, L. Verloock, E. Tanghe, and L. Martens, "In-situ measurement procedures for temporal RF electromagnetic field exposure of the general public," Health Phys., vol. 96, pp. 529–542, 2009.
[6] Z. Mahfouz, A. Gati, D. Lautru, M. F. Wong, J. Wiart, and V. F. Hanna, "Influence of traffic variations on exposure to wireless signals in realistic environments," Bioelectromagnetics, vol. 33, pp. 288–297, 2011.
[7] ITU, "ITU world radiocommunication seminar highlights future communication technologies," Press release, December 2010.
[8] E. Dahlman, S. Parkvall, J. Sköld, and P. Beming, 3G Evolution: HSPA and LTE for Mobile Broadband, second edition. Academic Press, 2008.
[9] L. Verloock, W. Joseph, A. Gati, N. Varsier, B. Flach, J. Wiart, and L. Martens, "Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation," Radiat. Prot. Dosimetry, Available online, 2012.
[10] Narda Safety Test Solutions, "SRM-3006 Selective Radiation Meter Data Sheet." [Online] Available: http://www.narda-sts.us/pdf_files/DataSheets/SRM3006_DataSheet.pdf [Accessed: 27 May 2013].





