Date: 11/11/2013
RESULTS
OF
PERFORMANCE COMPARISON
BETWEEN
MIMO AND ORDINARY 802.11 DEVICES
Draft 0.2
CONTENTS
1. DOCUMENT CONTROL
2. INTRODUCTION
3. PURPOSE
4. INTENDED AUDIENCE
5. ABBREVIATIONS
6. SUMMARY OF FINDINGS
7. TECHNICAL ANALYSIS
   7.1 The Methodology
   7.2 Test Environment
   7.3 Test Setup
   7.4 Tests
   7.5 Test Results
       7.5.1 Performance Benchmark Test Results
       7.5.2 Static Test Results
       7.5.3 Mobility Test Results
   7.6 Performance Comparisons for Static Tests
   7.7 Performance Analysis of Individual Device Combinations
8. APPENDIX A
1. DOCUMENT CONTROL
DOCUMENT NAME
ABSTRACT
VERSION HISTORY

VERSION     DATE            PREPARED BY
Draft 0.1   Dec 10, 2006    Raman Dhillon
Draft 0.2   Dec 13, 2006    Raman Dhillon
2. INTRODUCTION
3. PURPOSE
The purpose of this document is to present the findings of a set of wireless performance
assessment experiments carried out by Pecolab at CU Boulder.
4. INTENDED AUDIENCE
The intended audience of this document is the researchers at Pecolab.
5. ABBREVIATIONS
MIMO - Multiple Input, Multiple Output
AP - Access Point
RSSI - Received Signal Strength Indicator
6. SUMMARY OF FINDINGS
The following are the key observations in the wireless performance experiments:
- MIMO devices outperform ordinary 802.11 devices in terms of throughput and range of
signal reception. However, the MIMO performance is not several orders of magnitude
better than that of ordinary devices, as is claimed.
- The MIMO wireless card appears to be the real performance enhancer of a wireless
link among all the products that were tested. The MIMO card provides nearly identical
performance with both MIMO and ordinary Access Points (the MIMO AP is slightly better).
- The MIMO Access Point gives poor performance with an ordinary wireless adaptor; the
signal coverage is as poor as with a combination of ordinary AP and ordinary card.
7. TECHNICAL ANALYSIS
7.1 The methodology
The methodology of the performance comparison of MIMO and ordinary 802.11 products is
based on taking measurements for a variety of performance parameters using certain software
tools. The measurements are taken in a real-life scenario to simulate the actual user experience
while using these wireless products.
7.2 Test Environment
All tests were conducted in a real-life indoor environment. The measurements were taken from
a number of locations inside, and just outside, the Pecolab (Pervasive Communications Lab)
on Floor 1 of the Engineering Centre. The lab contains several physical obstacles, metallic
objects and other interference sources.
Further, the tests were conducted at several locations, with the wireless client at an
increasing distance from the AP for each new location. The objective of this approach is to
observe the quality of signal reception and the throughput achieved with increasing distance
and physical obstacles in the signal path. The distance here, however, is an approximation of
the free-space distance between the Access Point and the wireless client station; the actual
path between the AP and the client need not be a straight line, as there are several concrete
walls and other objects along the way.
The test environment is a typical worst-case environment in a real-life scenario. Users in a
home network are likely to experience better performance, due to fewer obstructions and
interfering sources than in an Electrical Engineering communications lab.
7.3 Test Setup
7.3.1 Setup
The test setup is shown in Figure 1 below. It consists of an infrastructure-mode wireless
LAN with a single Access Point operating in 802.11g mode and a number of wireless stations.
In addition, one computer node is connected to the Access Point through a 100 Mbps
Ethernet link.
[Figure 1: Test setup]
Chipset: Atheros
7.3.6 Wireless Station
The wireless stations used in the tests were IBM laptops running Linux OS (Kernel 2.6) and
having PC Card interfaces for adding external Wireless adaptors. Typically two wireless
stations were used for a given test: one station was part of the wireless LAN network while
the other station acted as the wireless sniffer with its wireless interface operating in the
monitor mode.
7.3.7 Software
The software tools used for measuring the various performance parameters involved in the
tests were:
- netperf: used for measuring throughput on the basis of a TCP stream test. netperf is a
benchmark network performance measurement tool.
- iwconfig: used for determining the RSSI (signal strength) and the data rate currently in
use on the wireless interface of the client station. iwconfig is a Linux/Unix utility for
configuring wireless interfaces.
- Ethereal: used for capturing frames on the station carrying out wireless sniffing.
Ethereal is open-source software for capturing and inspecting all sorts of packets in a
wired or wireless network.
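As an illustration of how such tool output can be consumed programmatically, the following Python sketch parses the current bit rate and RSSI out of iwconfig output. It assumes the classic Linux wireless-extensions output format; the interface name and values in the sample are made up for the example.

```python
import re

def parse_iwconfig(output: str):
    """Extract (bit rate in Mbps, signal level in dBm) from iwconfig output.

    Assumes the classic wireless-extensions format, e.g.
    'Bit Rate=54 Mb/s' and 'Signal level=-43 dBm'. Returns None for
    fields that are absent (e.g. 'no wireless extensions').
    """
    rate = re.search(r"Bit Rate[:=]\s*([\d.]+)\s*Mb/s", output)
    rssi = re.search(r"Signal level[:=]\s*(-?\d+)\s*dBm", output)
    return (
        float(rate.group(1)) if rate else None,
        int(rssi.group(1)) if rssi else None,
    )

# Hypothetical capture of iwconfig output on the client station
sample = """eth1  IEEE 802.11g  ESSID:"pecolab"
      Bit Rate=54 Mb/s   Tx-Power=20 dBm
      Link Quality=60/70  Signal level=-43 dBm"""
print(parse_iwconfig(sample))  # (54.0, -43)
```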
7.4 Tests
The following sections explain the types of tests that were conducted, the parameters that
were measured and the various device combinations that were employed in conducting the
experiments.
7.4.1 Wired throughput benchmark tests
These tests benchmark the network performance that can be achieved between two nodes
connected through wired 100 Mbps Ethernet interfaces available on the Access Point (to be
tested for wireless performance). The results obtained in these tests give the maximum
throughput value that can be achieved between the two nodes when one of the nodes is a
wireless node. This measures the internal switching speed of the access point and its
interface efficiency.
7.4.2 Static wireless client tests
Static tests determine the various performance parameters for wireless communication
between an Access Point and a static wireless client with the client being at a specified
distance from the AP. A number of locations are chosen at increasing distance from the AP for
measuring the performance parameters as discussed in previous sections.
7.4.3 Mobile wireless client tests
Mobility tests measure the performance of wireless communication when the client station
moves inside a wireless environment. Here two types of tests are possible: throughput
achieved by a client that moves away from the AP and the client that moves towards the AP.
The speed of mobility here is of the order of a user walking at a gentle pace while using a
wireless device (this speed is several times less than vehicular speed).
7.4.4 Performance parameters measured
The performance parameters measured in a given test are as follows:
1. Throughput: throughput of the wireless link between the client and the AP in Mbps
2. Signal Strength: RSSI at the receiver interface of the wireless adaptor
3. Packet Error Rate: error rate observed in terms of retransmissions of 802.11 frames
4. Data rate: bit rate observed on the wireless interface of a wireless client station
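For bookkeeping, the four parameters recorded at each client location can be grouped into a simple record; the field names below are illustrative and not taken from the original test scripts.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """One measurement at a given client location (illustrative names)."""
    throughput_mbps: float   # netperf TCP stream throughput
    rssi_dbm: int            # signal strength at the wireless adaptor
    error_rate_pct: float    # 802.11 frame retransmission percentage
    data_rate_mbps: float    # bit rate reported on the wireless interface

s = Sample(21.43, -42, 4.31, 11.0)
print(s.rssi_dbm)  # -42
```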
7.4.5 Device combinations
The following combinations of access points and wireless adaptors were employed:
1. Ordinary AP - Ordinary wireless card
2. Ordinary AP - MIMO wireless card
3. MIMO AP - MIMO wireless adaptor
4. MIMO AP - Ordinary wireless adaptor
7.4.6 Test description
A typical test looks like this:
A netperf client (on a wireless station) initiates a TCP connection with a netperf server (on a
node with a wired connection to the AP) and transmits a TCP packet stream. The typical size
of a TCP packet sent from the client to the server is 1500 bytes, while the server's response
to the client is a smaller 100-byte packet. Each test run of netperf takes 10 seconds (the
default value) to complete.
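The throughput figure can be pulled out of each run with a small parser. This is a sketch assuming the classic netperf TCP_STREAM result layout, where the final data line ends with the measured throughput in 10^6 bits/sec; the sample output is made up for the example.

```python
def netperf_throughput(output: str) -> float:
    """Return the throughput (Mbps) from netperf TCP_STREAM output.

    Assumes the classic netperf layout: the last non-empty line is the
    result row and its final column is throughput in 10^6 bits/sec.
    """
    lines = [line for line in output.splitlines() if line.strip()]
    return float(lines[-1].split()[-1])

# Hypothetical netperf output for one 10-second run
sample = """TCP STREAM TEST to 192.168.1.10
Recv   Send    Send
Socket Socket  Message  Elapsed
Size   Size    Size     Time     Throughput
bytes  bytes   bytes    secs.    10^6bits/sec

 87380  16384  16384    10.00      21.43"""
print(netperf_throughput(sample))  # 21.43
```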
A script is used to run the netperf tool for a user-defined number of iterations. The script
combines the results obtained from netperf (throughput) and iwconfig (RSSI, data rate) and
displays them on-screen, apart from logging these results into a text file.
A separate wireless station operating in the monitor mode captures packets during the entire
test run carried out using netperf. These captures are used to determine the percentage of
frame retransmissions for the entire communication. All these results are then recorded for
further analysis.
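The retransmission percentage can be computed from the sniffer's capture by checking the Retry flag, which is bit 11 of each 802.11 frame's Frame Control field; a minimal sketch, with made-up frame values:

```python
def retry_percentage(frame_control_fields) -> float:
    """Percentage of captured 802.11 frames with the Retry bit set.

    frame_control_fields: iterable of 16-bit Frame Control values.
    The Retry flag is bit 11 (mask 0x0800) of the Frame Control field.
    """
    frames = list(frame_control_fields)
    if not frames:
        return 0.0
    retries = sum(1 for fc in frames if fc & 0x0800)
    return 100.0 * retries / len(frames)

# 1 retransmitted frame out of 4 captured frames -> 25.0 %
print(retry_percentage([0x0801, 0x0001, 0x0041, 0x0048]))  # 25.0
```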
7.5 Test Results
All tests were conducted with the AP and wireless adaptors in 802.11g mode of operation.
Further, channel 4 (2.427 GHz) was used for wireless communication, and the wireless
adaptors were in auto-rate mode, i.e. the wireless interfaces were free to adapt the data rate
according to changes in signal quality.
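The channel number and frequency quoted above follow the standard 2.4 GHz band mapping (center frequency = 2407 + 5 × channel, in MHz, for channels 1-13), which can be sketched as:

```python
def channel_to_mhz(channel: int) -> int:
    """Center frequency in MHz of a 2.4 GHz band 802.11 channel (1-13)."""
    if not 1 <= channel <= 13:
        raise ValueError("expected a 2.4 GHz channel in 1..13")
    return 2407 + 5 * channel

print(channel_to_mhz(4))  # 2427, i.e. the 2.427 GHz used in the tests
```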
The following sections explain the results of various types of tests.
7.5.2 Static Test Results

The four tables below give the static test statistics (mean ± standard deviation over all test
runs) for the four device combinations, in the order listed in section 7.4.5. "NA" indicates
that the client had lost network connectivity at that distance.

Table 1: Ordinary AP - Ordinary wireless card

Distance      Throughput (Mbps)   Signal Strength (dBm)   Data Rate (Card, Mbps)   Packet Error Rate (%)
< 1 m         4.76 ± 2.05         -43 ± 4                 13.30 ± 6.81             11.49 ± 4.85
1 m - 5 m     21.43 ± 5.23        -42 ± 1                 11.00 ± 0.00             4.31 ± 0.93
5 m - 10 m    28.21 ± 1.01        -52 ± 3                 11.00 ± 0.00             2.58 ± 0.91
10 m - 15 m   6.12 ± 5.22         -70 ± 3                 12.00 ± 2.16             12.59 ± 2.00
15 m - 20 m   7.08 ± 2.96         -75 ± 2                 17.40 ± 1.90             7.74 ± 1.64
20 m - 25 m   NA                  -86 ± 1                 50.40 ± 3.29             46.75 ± 0.00
> 25 m        NA                  NA                      NA                       NA

Table 2: Ordinary AP - MIMO wireless card

Distance      Throughput (Mbps)   Signal Strength (dBm)   Data Rate (Card, Mbps)   Packet Error Rate (%)
< 1 m         31.27 ± 0.92        -23 ± 4                 45.00 ± 9.23             1.13 ± 0.45
1 m - 5 m     30.05 ± 1.36        -43 ± 1                 36.00 ± 0.00             1.22 ± 0.09
5 m - 10 m    22.47 ± 11.09       -55 ± 4                 8.10 ± 5.60              2.66 ± 3.61
10 m - 15 m   16.83 ± 1.31        -66 ± 1                 32.00 ± 10.56            4.43 ± 1.12
15 m - 20 m   4.82 ± 2.22         -75 ± 1                 11.75 ± 4.52             13.36 ± 3.80
20 m - 25 m   3.99 ± 1.46         -81 ± 2                 9.80 ± 2.48              14.84 ± 2.14
> 25 m        2.78 ± 0.78         -82 ± 1                 11.00 ± 0.00             13.40 ± 1.54

Table 3: MIMO AP - MIMO wireless adaptor

Distance      Throughput (Mbps)   Signal Strength (dBm)   Data Rate (Card, Mbps)   Packet Error Rate (%)
< 1 m         23.15 ± 4.00        -22 ± 3                 31.80 ± 10.13            6.31 ± 0.69
1 m - 5 m     30.71 ± 2.29        -46 ± 2                 36.00 ± 0.00             1.27 ± 0.06
5 m - 10 m    27.90 ± 3.08        -57 ± 1                 37.20 ± 3.79             3.96 ± 0.36
10 m - 15 m   22.01 ± 2.64        -70 ± 1                 49.80 ± 7.51             6.86 ± 0.58
15 m - 20 m   5.70 ± 1.60         -80 ± 1                 12.67 ± 6.45             7.55 ± 0.56
20 m - 25 m   0.73 ± 0.58         -89 ± 3                 14.93 ± 11.60            45.37 ± 4.78
> 25 m        0.29 ± 0.22         -92 ± 1                 11.70 ± 6.67             55.81 ± 2.62

Table 4: MIMO AP - Ordinary wireless adaptor

Distance      Throughput (Mbps)   Signal Strength (dBm)   Data Rate (Card, Mbps)   Packet Error Rate (%)
< 1 m         6.58 ± 3.10         -41 ± 2                 8.85 ± 11.12             12.99 ± 3.96
1 m - 5 m     25.97 ± 6.70        -49 ± 1                 5.00 ± 0.00              2.12 ± 0.28
5 m - 10 m    28.09 ± 2.50        -59 ± 4                 5.00 ± 0.00              4.04 ± 0.41
10 m - 15 m   6.65 ± 2.67         -75 ± 5                 17.00 ± 13.00            14.18 ± 4.95
15 m - 20 m   0.49 ± 0.52         -83 ± 4                 19.26 ± 9.07             34.52 ± 11.46
20 m - 25 m   NA                  NA                      NA                       NA
> 25 m        NA                  NA                      NA                       NA
7.5.3 Mobility Test Results

The MIMO card, in a towards test, stalls until a certain threshold distance
(or signal strength) is reached. Once within this threshold range the MIMO card delivers good
performance consistently.
An ordinary card, in an away test, starts with low throughput when very close to the AP.
As the wireless station moves away from the AP the performance peaks and remains
consistent for a certain distance. However, the performance degrades quickly once a certain
threshold distance is crossed.
The ordinary card in a towards test delivers consistently low throughput while approaching
the AP. However, after a certain threshold range it picks up performance and delivers
consistent throughput within that peak range.
There are two noticeable differences between the ordinary and MIMO card performance, as
described below:
1. The ordinary card seems to deliver better performance when moving towards the AP in
comparison to the MIMO card, for which the test tool stalls altogether (for a certain
distance).
2. The MIMO card's throughput recovery is better than the ordinary card's once it reaches
the threshold range while moving towards the AP.
The following sections depict plots of the "away from AP" and "towards the AP" mobility tests
for all combinations of APs and wireless adaptors. Each plot shows a typical single test run
rather than a generalization of the behaviour observed over a number of test runs.
Figure 2: Ordinary AP - Ordinary Card Away Test (throughput in Mbps vs. successive measurements moving away from the AP)
Figure 3: Ordinary AP - Ordinary Card Towards Test (throughput in Mbps vs. successive measurements moving towards the AP)
Figure 4: Ordinary AP - MIMO Card Away Test (throughput in Mbps vs. successive measurements moving away from the AP)
Figure 5: Ordinary AP - MIMO Card Towards Test (throughput in Mbps vs. successive measurements moving towards the AP)
Figure 6: MIMO AP - MIMO Card Away Test (throughput in Mbps vs. successive measurements moving away from the AP)
Figure 7: MIMO AP - MIMO Card Towards Test (throughput in Mbps vs. successive measurements moving towards the AP)
Figure 8: MIMO AP - Ordinary Card Away Test (throughput in Mbps vs. successive measurements moving away from the AP)
Figure 9: MIMO AP - Ordinary Card Towards Test (throughput in Mbps vs. successive measurements moving towards the AP)
7.6 Performance Comparisons for Static Tests

7.6.1 Comparison of Throughput

[Figure 10: Throughput (Mbps) vs. distance (m) for all device combinations]
Figure 11: Plot of averaged values (the error bars have been removed for clarity)
As can be seen in the figures above, MIMO devices enable better throughput than ordinary
802.11 devices. However, the MIMO card gives nearly identical performance with both MIMO
and ordinary APs. The plot of averaged throughputs (figure 11) shows curves that are
characteristic of the wireless adaptor in use (on the client station) rather than of the type of
Access Point in use. The error bars in figure 11 have been removed for the sake of clarity;
the variations can be observed in the statistics presented in section 7.5.2.
7.6.2 Comparison of Received Signal Strength
The following figures (12 and 13) illustrate the received signal strength on the client
station's wireless interface for the various combinations of MIMO and ordinary 802.11 devices.
The statistics used in the plots are the results of the static tests described in section 7.4.2.
[Figure 12: RSSI (dBm) vs. distance (m) for all device combinations]
[Figure 13: Signal strength (RSSI, dBm) vs. distance, averaged values]
An ordinary 802.11g card working with a MIMO AP loses the network connection after 20 m of
unobstructed space in the test environment. Similarly, the same ordinary 802.11g card loses
network connectivity after 25 m when the access point is an ordinary 802.11g AP.
The MIMO card does better with both ordinary and MIMO Access Points in terms of the
signal strength (RSSI) received on the wireless interface.
7.7 Performance Analysis of Individual Device Combinations
The following sections illustrate the changes in throughput, data rate and packet error rate for
the various combinations of Access Points and wireless adaptors, as observed in the static
tests. The values of packet error rate (%) have been scaled up to show the relationship
between throughput and error rate clearly. All values have been averaged over the complete
sample set, and error bars have been omitted to avoid clutter.
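The averaging described above is a plain per-bin mean, with the omitted error bars corresponding to the sample standard deviation; a minimal sketch, with made-up sample values:

```python
from statistics import mean, stdev

def summarize(samples):
    """Per-bin mean and sample standard deviation (the omitted error bars)."""
    m = round(mean(samples), 2)
    s = round(stdev(samples), 2) if len(samples) > 1 else 0.0
    return m, s

# Three throughput samples from one distance bin (illustrative values)
print(summarize([30.0, 31.0, 32.0]))  # (31.0, 1.0)
```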
7.7.1 Ordinary AP - Ordinary Card
Figure 14: Comparison of various performance parameters
7.7.2 Ordinary AP - MIMO Card

[Figure 15: Comparison of throughput (Mbps), data rate (Mbps) and packet error rate (%) vs. distance]
7.7.3 MIMO AP - MIMO Card

[Figure 16: Comparison of throughput (Mbps), data rate (Mbps) and packet error rate (%) vs. distance]
7.7.4 MIMO AP - Ordinary Card

[Figure 17: Comparison of throughput (Mbps), data rate (Mbps) and packet error rate (%) vs. distance]
8. APPENDIX A
MIMO AP Behavior
An interesting observation was made in the context of RangeMAX (MIMO) AP operation.
A typical sequence of frames observed periodically in the network (with no wireless client
station being part of the network) is as follows:
AP                                    Broadcast
 |                                        |
 |------------- Beacon ------------------>|
 |                                        |
 |<------------ CTS ---------------------- (source: None)
 |                                        |
 |------------- CF-End ------------------>|
The Beacon frame's Capability Information field contains the CF-Pollable and CF-Poll Request
bits both set to 0 (zero). This combination indicates that there is no point coordinator (PC) at
the AP.
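The decoding of the Capability Information bits mentioned above can be sketched as follows. The bit positions (CF-Pollable is bit 2, CF-Poll Request is bit 3) and the AP interpretation table follow the 802.11 standard; the capability value in the example is hypothetical.

```python
# Interpretation of (CF-Pollable, CF-Poll Request) in an AP's beacon,
# per the 802.11 Capability Information field definition.
_AP_PCF = {
    (0, 0): "no point coordinator at AP",
    (0, 1): "point coordinator at AP for delivery only (no polling)",
    (1, 0): "point coordinator at AP for delivery and polling",
    (1, 1): "reserved",
}

def ap_pcf_support(capability_info: int) -> str:
    """Decode the CF-Pollable (bit 2) and CF-Poll Request (bit 3) bits of
    an AP beacon's 16-bit Capability Information field."""
    bits = ((capability_info >> 2) & 1, (capability_info >> 3) & 1)
    return _AP_PCF[bits]

# ESS bit set (0x0001), CF bits clear: the case observed in Appendix A
print(ap_pcf_support(0x0001))  # no point coordinator at AP
```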