

CDMA RF NETWORK OPTIMIZATION GUIDEBOOK

NETWORK SOLUTIONS SECTOR

May 16, 2000


Version 1.2

MOTOROLA CONFIDENTIAL Copyright Motorola 1999. This document and the information contained herein are CONFIDENTIAL INFORMATION of Motorola, and shall not be used, published, disclosed, or disseminated outside of Motorola, in whole or in part, without Motorola's consent. This document contains trade secrets of Motorola. Reverse engineering of any or all of the information in this document is prohibited. The copyright notice does not imply publication of this document.

Motorola Confidential Proprietary


Revision History
Date: 3/2/99    Version: 1.0
Authors: Muhammad Alazari, Jason Burkart, Ray Carbone, John Castonia, Brenna Hall, Dennis Helm, Jonathan Hutcheson, Sandra Martin, Jay Patel, Charles Reisman, Dwaine Spresney
Editors: John Castonia, Paul Venizelos
Revision: First Release (without Chapter 11)

Date: 1/27/00   Version: 1.1
Author: Jason Burkart
Editor: Jason Burkart

Date: 5/16/00   Version: 1.2
Author: Jason Burkart
Revision: Removal of references to Vol. 2 and mailto links for J. Castonia. Removal of PRM note for link to be added (p. 29). NSS as author; removed NES.


TABLE OF CONTENTS

1.0 INTRODUCTION
    1.1 Purpose
    1.2 Organization of this Document

2.0 NETWORK DESIGN VERIFICATION/REVIEW
    2.1 Description
    2.2 Tools Required
    2.3 Personnel Required
    2.4 Entrance Criteria
    2.5 Procedure
    2.6 Analysis Conducted
    2.7 Exit Criteria
    2.8 Recent Developments
    Appendix 2A: New Developments in Simulation Domain Optimization
    Appendix 2B: Sample Problem Resolution Matrix (PRM)

3.0 EQUIPMENT INSTALLATION AND TEST
    3.1 Description
    3.2 Tools Required
    3.3 Personnel Required
    3.4 Entrance Criteria
    3.5 Procedure
    3.6 Analysis Conducted
    3.7 Exit Criteria
    Appendix 3A: ITP Checklists

4.0 DATABASE VERIFICATION
    4.1 Description
    4.2 Tools Required
    4.3 Personnel Required
    4.4 Entrance Criteria
    4.5 Procedure
    4.6 Analysis Conducted
    4.7 Exit Criteria
    4.8 Recent Developments
    Appendix 4A: SHOW_ALLPARMS Usage
    Appendix 4B: Procedure to Evaluate Transcoder Parameters

5.0 SPECTRUM CLEARING, NOISE FLOOR TEST VERIFICATION, AND NOISE MONITORING
    5.1 Description
    5.2 Tools Required
    5.3 Personnel Required
    5.4 Entrance Criteria
    5.5 Procedure
    5.6 Analysis Conducted
    5.7 Exit Criteria
    5.8 Recent Developments

6.0 TOOLS SELECTION, INSTALLATION, AND TEST
    6.1 Description
    6.2 Tools Required
    6.3 Personnel Required
    6.4 Entrance Criteria
    6.5 Procedure
    6.6 Analysis Conducted
    6.7 Exit Criteria
    Appendix 6A: Tools References

7.0 SINGLE CELL FUNCTIONAL TEST (SCFT)
    7.1 Description
    7.2 Tools Required
    7.3 Personnel Required
    7.4 Entrance Criteria
    7.5 Procedure
    7.6 Data Analysis Procedures
    7.7 Exit Criteria
    Appendix 7B: Sample Call Sampling Data Log Sheet
    Appendix 7C: Single Cell Functional Test Tracking Sheet
    Appendix 7D: .TIM File Header (Description of .TIM File Data Contents)
    Appendix 7E: COMPAS IS-95 Messaging Acronyms

8.0 INITIAL COVERAGE TEST
    8.1 Description
    8.2 Tools Required
    8.3 Personnel Required
    8.4 Entrance Criteria
    8.5 Procedure
    8.6 Analysis Conducted
    8.7 Exit Criteria
    Appendix 8A: Sample Directory Structure

9.0 RF NETWORK OPTIMIZATION
    9.1 Description
    9.2 Tools Required
    9.3 Personnel Required
    9.4 Entrance Criteria
    9.5 Procedure
    9.6 Analysis
    9.7 Exit Criteria
    Appendix 9A: Change Request Forms and Change Orders

10.0 FINAL COVERAGE SURVEY & WARRANTY VERIFICATION
    10.1 Description
    10.2 Tools Required
    10.3 Personnel Required
    10.4 Entrance Criteria
    10.5 Procedure
    10.6 Analysis Conducted
    10.7 Exit Criteria
    Appendix 10A

11.0 SYSTEM OPERATIONS
    11.1 Description
    11.2 Tools Required
    11.3 Personnel Required
    11.4 Entrance Criteria
    11.5 Procedure
    11.6 Analysis Conducted
    11.7 Exit Criteria

APPENDIX A: ROLES AND RESPONSIBILITIES
    A.1 White Belt (System Engineer, Entry Level)
    A.2 Green Belt (System Engineer)
    A.3 Blue Belt (System Engineer)
    A.4 Black Belt (System Engineer)
    A.5 Diagnostic Monitor (DM) Operator
    A.6 Landline Operator
    A.7 Driver
    A.8 Bridge Operator
    A.9 CBSC/Switch Engineer
    A.10 CFE
    A.11 Database Engineer
    A.12 Development Support

APPENDIX B: HARDWARE/SOFTWARE
    Appendix B-1: Check List for Metric Operators

LIST OF FIGURES
Figure 2.1-1: Relationship of Network Design Verification to Entire Optimization Process
Figure 2.4-1: Design Review Check List (Page 1)
Figure 2.4-1: Design Review Check List (Page 2)
Figure 2.6-1: Simulation Prediction of Coverage Areas
Figure 2.6-2: Final Checklist for System Acceptance Criteria
Figure 4.1-1: Relationship of Database Verification Activity to Entire Optimization Process
Figure 5.1-1: Relationship of Spectrum Clearing Activity to Entire Optimization Process
Figure 5.8-1: Output of banditview script
Figure 6.1-1: Relationship of Tools Selection, Installation and Test Activity to Entire Optimization Process
Figure 6.1-2: Tools Overview
Figure 7.1-1: Relationship of Single Cell Functional Test Activity to Entire Optimization Process
Figure 7.5.1.1-1: Sample SCFT drive route map for Method 1
Figure 7.5.1.1-2: Sample SCFT drive route map for Method 2
Figure 7.5.1.1-3: Sample Soft Handoff drive route map
Figure 7.5.2-1: Block Diagram of a Typical CDMA Drive Test Van Setup
Figure 7.6.1.2-1: Example of a Browsed CDLLOG (Start)
Figure 7.6.1.2-1: Example of a Browsed CDLLOG (End)
Figure 7.6.3-1: PN Plot for Site 106, Sector 6
Figure 7.6.3-2: PN Plot for Site 106, Sector 1
Figure 8.1-1: Relationship of Initial Coverage Test Activity to Entire Optimization Process
Figure 8.5.4-1: PMMCC Report
Figure 8.5.4-2: CEM Report
Figure 8.5.4-2B: Device Outage and Alarm Listing
Figure 8.5.4-2C: Alarm Summary
Figure 9.5-1: Overall Optimization Flow
Figure 9.6.3-1: PN offset plan (text file)
Figure 9.6.3-2: PN Output Plot in Compas
Figure 9.6.3-3: NetPlan Path Profile Plot
Figure 9.6.3-4: COMPAS Plot Illustrating an Overshooting PN
Figure 9.6.4-1: System Architecture Overview
Figure 9.6.4.2-1: Drive Test Log Sheet to correlate with CFC 9 problem
Figure 9.6.4.2-2: CDL Log Correlating to Drive Team Log Sheet
Figure 9.6.5-1: Excerpt from esn.t20 from CAT
Figure 9.6.5-2: Excerpt from esn.t20 for bad mobile
Figure 9.6.5-3: Excerpt from call_dur.dst from CAT
Figure 9.6.6.1.1-1: Example of Error Window in Compas
Figure 9.6.6.1.1-2: Example of Missing Data
Figure 10.1-1: Relationship of Final Coverage Survey and Warranty Testing Activity to Entire Optimization Process

LIST OF TABLES
Table of URLs
Table 2.2-1: Candidate Network Planning and Simulation Tools
Table 2.2-2: NetPlan Reference Documentation
Table 2.2-3: NetPlan CDMA Static Simulator Documentation
Table 2.3-1: Personnel Required
Table 2.6-1: Simplified Problem Resolution Matrix for Simulation Prediction
Table 3.3-1: Personnel Required
Table 4.2-1: Database Verification Tools and References
Table 4.3-1: Personnel Required
Table 5.2-1: Interference Isolation Tools
Table 5.3-1: Personnel Required
Table 6.2-1: Tools Required To Conduct CDMA Optimization Tools Survey
Table 6.3-1: Personnel Required
Table 6.5.2-1: Sample Tools Evaluation Spreadsheet
Table 6A: Motorola Developed Tools & Products
Table 7.2-1: Tools Required
Table 7.3-1: Personnel Required
Table 7.5-1: Tools Required For SCFT Data Analysis
Table 7.5.3.3-1: Channel Verification for 3-sector MCC 16
Table 7.6.1.2-1: Entry Type Definitions for CDLs
Table 8.2-1: Tools Required for Initial Coverage Test
Table 8.3-1: Personnel Required
Table 8.6.1-1: Relationship Between Number of Pilots Serving an Area and Acceptable Mobile Receive Signal Strength
Table 9.2-1: Tools Required for RF Network Optimization
Table 9.3-1: Personnel Required
Table 9.6.3-1: Pilot Analyzer Output
Table 9.6.3-2: Data table for Non-Dominant Pilots
Table 9.6.4.1-1: Drive Team Problem Reports and Likely Causes
Table 9.6.4.2-1: Normal CFC distribution
Table 9.6.4.2-2: Optimization Problem Troubleshooting Table
Table 9.6.4.4-1: Problems Seen and Escalation Procedures
Table 10.2-1: Tools Required for Final Coverage Survey/Warranty Verification
Table 10.3-1: Personnel Required

TABLE OF URLs

URL | Description | Category | Chapter
http://www.cig.mot.com/cdma_ase/index.html | For simulations, default parameters listed in [Matt Dillon's] release-specific spreadsheets should be used as inputs; choose the link for the target release. | Reference | 2, 9
http://www.cig.mot.com/TED/docs.html | NetPlan reference documentation, including the CDMA System Static Simulator. Follow the steps under "How to Order Manuals and CD-ROMs". | Reference | 2
http://www.cig.mot.com/TED/docs.html | The RF Engineering User's Manual shows the different images in NetPlan; manual number is 68P09245A02-O. | Reference | 2
http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html | CDMA RF System Design Procedure and CDMA RF Planning Guide: procedures and guidelines for the network design activity. Select the RF Planning button. | Reference | 2, 9
http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html | Technical reference documents on the usage of the CDMA System Static Simulator. Select the RF Planning button. | Reference | 2
http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html | Drive Test Procedures and Xlos Tuning Using Drive Test Data document. Select the RF Planning button. | Reference | 2
http://www.sesd.cig.mot.com | NetPlan product group's home page. Network planning and simulation tool by Motorola. | Tools | 2
http://ww.glenayre.com | 3rd party network planning and simulation tool. | Tools | 2
http://www.lucent.com/ | 3rd party network planning and simulation tool. | Tools | 2
http://www.lucent.com/ | 3rd party network planning and simulation tool. | Tools | 2
http://www.msi-world.com/home.html | 3rd party network planning and simulation tool. | Tools | 2
http://www.primeco.com/ | 3rd party network planning and simulation tool. | Tools | 2
http://www.uswest.com/ | 3rd party network planning and simulation tool. | Tools | 2
(Not sure where this will go yet) | Download a sample of a Problem Resolution Matrix (PRM). | Tools | 2
http://www.safco.com | Information on SAFCO's design tool, Wizard. | Tools | 2
http://www.qualcomm.com | Candidate network planning and simulation tool. | Tools | 2
http://www.cig.mot.com/TED/Training/training.html | Information regarding NetPlan training. | Training | 2
http://www.cig.mot.com/TED/docs.html | BTS Optimization/ATP procedure manuals for various BTS models. | Reference | 3
http://www.cig.mot.com/cdma_ase/index.html | Spreadsheet containing recommended default parameter settings for each software release. | Reference | 4
http://www.cig.mot.com/cdma_ase/index.html | Matt Dillon's Parameter and Optimization Guide. | Reference | 4
http://www.cig.mot.com/TED/docs.html | Reference to all system commands, their syntax, and sample outputs. Click on the hyperlink "Online Product Documentation", choose the Supercell button, then the SC Product Family-CDMA button. Click on the hyperlink "OMCR/CBSC/SYSTEM" and then scroll down to "System Commands Reference". | Reference | 4
http://www.pamd.cig.mot.com/nds/cts/rftech/App_Notes/icsho/ | Tables required for Inter-CBSC Soft and anchor handoff configurations. Select the hyperlink entitled icsho_CAN_v0_1.fm. | Reference | 4
http://www.pamd.cig.mot.com/nds/cts/rftech/App_Notes/icsho/icsho.html#fyis | Where to check for FYIs concerning ICBSC-SHO. | Reference | 4
http://www.rochellepark.pamd.cig.mot.com/software.html | Script that graphically displays neighbor lists and various parameter settings. | Tools | 4
http://www.cig.nml.mot.com/~spresney/Compas_NL/Compas_NL.html | Path to download the script Compas_NL, which generates a file that can be read into NetPlan so the neighbor list can be displayed graphically. | Tools | 4
http://www.pamd.cig.mot.com/~toolprod/falcon | Java-based database visualization tool that provides insight into the contents of the MIB. | Tools | 4, 10
http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html | The show_allparms script, which compares the installed MIB to the recommended default RF parameter settings and generates a report of the differences. Click on the scripts button and choose the correct version. | Tools | 4
http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html | Procedure for performing noise floor testing, entitled CDMA Uplink Noise Survey Procedure, under the RF Planning option. | Related Process | 5
http://scwww.cig.mot.com/~thakkar/smap.html | A tool used to detect and isolate noise- or interference-induced problems. | Tools | 5
http://www.cig.mot.com/ted/EXT_WEB/TED/pdf/english/R7pdf_nof/226A24GO/226A24GO.PDF | Information about the CLI command browse cdllog in the chapter titled Event Management. | Reference | 6
http://engdb.tlv.cig.mot.com/tools/PilotAnalyzer.html | Tool recommended for the RF optimization of a CDMA system. Input is HP Pilot Scanner data. | Tools | 6, 9
http://www.cig.mot.com/~spresney/sho_time/sho_time.html | Tool recommended for the RF optimization of a CDMA system. Generates neighbor list recommendations from DM data. | Tools | 6, 9
http://www.cig.mot.com/~reimrsrr/NLP.html | Tool recommended for the RF optimization of a CDMA system. Generates neighbor list recommendations from CDLs. | Tools | 6
http://www.cig.nml.mot.com/~spresney/CT_neighbor_scripts.html | Tool recommended for the RF optimization of a CDMA system. Generates neighbor list recommendations from scanner data. | Tools | 6, 9
http://www.cig.mot.com/~spresney/sho_time/sho_time.html | Alternate location for the sho_time tool. Generates neighbor list recommendations from DM data. | Tools | 6
http://www.sesd.cig.mot.com/compas/ | Tool recommended for the RF optimization of a CDMA system. Post-processing tool using DM data and SMAP data. | Tools | 6
http://www.hp.com/go/drive_test/ | Information on the Hewlett-Packard pilot scanner. | Tools | 6
http://www.qualcomm.com/cdma/optimization/ | Windows-based DM from Qualcomm. | Tools | 6
http://www.qualcomm.com/cda/technology/display/0,1476,1_21,00.html | A listing of CDMA licensed suppliers from Qualcomm. | Tools | 6
http://www.rsd.de/produkt/215a.htm | Information about a DM from Rohde & Schwarz, Inc. | Tools | 6
http://www.rsd.de/produkt/tm_mob.htm | Alternate location for information about a DM from Rohde & Schwarz, Inc. | Tools | 6
http://www.tmo.hp.com/tmo/datasheets/English/HPE7472A.html | Hewlett-Packard E7472A CDMA integrated RF and call performance coverage test system. | Tools | 6
http://www.global.anritsu.com/products/test/rfmicrowireless/MT8802A.html | Information regarding Anritsu Company (Radio Communication Analyzer). | Tools | 6
www.grayson.com | Information about Grayson's wireless PN scanner and wireless analyzer. | Tools | 6
http://www.rochellepark.pamd.cig.mot.com/software.html | Script to install PM reports based on PM statistics generated on the OMCR. | Tools | 11
http://www.safco.com | 3rd party tool that may be considered as a candidate for various optimization activities. | Tools | 6
http://scwww.cig.mot.com/SC/mgmt/tools/CDMA/Test_Tools/CAMPS/index.html | Information about the CAMPs DM, which collects mobile phone data and GPS position data. | Tools | 6, 7
http://www.cig.mot.com/~wheelrts/analyzer.html | Information on the CDL Analysis Tool and path to download the install script. | Tools | 6, 7, 9
http://www.cig.mot.com/~klnknbrg/r5cfcdocument.html | Call processing sequences, including call setups for both originations and terminations. | Reference | 7
http://www.cig.mot.com/~wheelrts/analyzer.html | Explanation of the reports/files created by the CDL Analysis Tool. | Reference | 7, 9
http://www.safco.com/measurement/walkabout.html | Information about SAFCO's DM. | Tools | 7
http://www.cig.nml.mot.com/cdma/kctopt/tools/ | A Motorola tool capable of viewing a PN plot within the COMPAS tool. Select PN Plot from the Tool Box. | Tools | 7
http://www.trimble.com | Trimble GPS information. Required to supply time/location data for each DM. | Tools | 7
http://www.cig.mot.com/Organization/TED/Documentation/Tools/tools.html | Information about COMPAS. The document number is COMPAS3.2 68P09248A05-A. | Tools | 7
http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html | Auto Dial, call termination script. | Tools | 8
http://www.cig.mot.com/~thakkar/smap.html | SMAP installation and configuration notes. | Tools | 8, 10
http://scwww.cig.mot.com/~thakkar/smap.html | SMAP installation and configuration notes for OCNS forward link loading. | Tools | 8, 10
http://scwww.cig.mot.com/people/cdma/PjM/product/release_info/ | Product release information. | Reference | 8
http://www.cig.mot.com/~dillon/ | Reference material regarding how to diagnose and solve problems. | Reference | 9
http://scwww.cig.mot.com/people/cdma/PjM/product/release_info/ | Information regarding specific releases. | Reference | 9
http://scwww.cig.mot.com/~thakkar/smap.htm | Reverse link messaging tool. | Tools | 9
http//www.rochellpark.pamd.cig.mot.com/software.html | PMSUM suite of scripts, which combines the most widely used PM reports into four reports for easy usage. Select the pmsum section on the web page. | Tools | 9
http://www.cig.mot.com/~reimrsrr/SOS.html | SOS script that looks at the mobile messaging. | Tools | 9
http://www.cig.mot.com/standards/CDMA_STDS/TIA_CDMA_STDS.html#ai | IS-95 and TIA/EIA-95 specifications. | Reference | 9
http://www.rochellepark.pamd.cig.mot.com/~dhelm/ | Path to download the ASSIST script that is run on the *.map files created by COMPAS; choose the link titled RF Warranty Tools Page. | Tools | 10


1.0 Introduction
1.1 Purpose
The purpose of this document is to consolidate the information CDMA RF network optimization engineers need to optimize a CDMA system. This guidebook frequently refers to other reference materials, providing links to useful web sites both internal and external to Motorola, since the intent is not to reinvent work already completed.

To begin planning for the optimization process, the market manager and/or lead optimization engineer should determine the resources available and the resources needed for each step of the process. A deployment plan template is available at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html: click the Project Planning button, then the link CDMA Project Plan IDP Template, and then the link CDMA Project Plan Template v5. This template walks the market manager through each step of planning the project and provides general guidelines on the number of resources needed for each step.

The network optimization process focuses on getting the network ready for commercial launch. Typically, this testing is done under an unloaded condition. The primary objectives of network optimization are to identify and eliminate any hardware and database implementation errors and to arrive at a set of optimal operating parameters and equipment settings (e.g., antenna tilts, azimuths, and SIF power settings) that provide an acceptable level of performance. That acceptable level of performance can be specified and measured in terms of a combination of any of the following:

- coverage area: measured in terms of Mobile Receive Power, adequate Ec/Io, and/or Mobile Transmit Power
- voice quality criteria: measured in terms of Frame Erasure Rate (FER) on the forward and/or reverse links
- target call completion and call drop rates

Specific targets such as those listed above for acceptable levels of performance are mandated by the contract. Market managers and the lead system engineer of the optimization team should thoroughly review the contract, know the deadlines and penalties, and keep a copy of the contract for reference. In most instances, customers will want to compare the coverage of the deployed network to the original predicted coverage generated with a network-planning tool. Portions of this optimization process are iterative, collecting and analyzing drive test data to converge on an optimal set of operating parameters. (A small illustration of checking drive-test data against such targets appears at the end of this section.)

Much of the material in this document will be very familiar to practicing optimization engineers, and many engineers may in fact have contributed to previous, similar documents. What this guidebook focuses on is coordinating and cross-referencing the wealth and diversity of information from various departments and markets, ranging from basic optimization processes to discussion of more advanced product features (e.g., multi-carrier). Each update will attempt to standardize the optimization practice, capture recent developments in optimization practices and strategies in various markets, and present and discuss recent tool developments as reported back by the readers and users of this document. References to recent developments in the area of network optimization are contained in shaded boxes in this document.
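The acceptance criteria above lend themselves to simple per-sample compliance checks. The sketch below is illustrative only and is not the output format of any Motorola tool: the record layout, field names, and thresholds are all hypothetical, chosen only to show how drive-test samples might be tallied against contract targets.

```python
# Illustrative only: tallies drive-test samples against the kinds of
# acceptance targets described above. All field names and threshold
# values here are hypothetical, not any Motorola tool's format.

from dataclasses import dataclass

@dataclass
class Sample:
    rx_power_dbm: float   # Mobile Receive Power
    ec_io_db: float       # best-server Ec/Io
    fwd_fer: float        # forward Frame Erasure Rate (0.0 - 1.0)

def coverage_compliance(samples, rx_min=-90.0, ecio_min=-12.0, fer_max=0.02):
    """Fraction of samples meeting all three example criteria."""
    good = sum(1 for s in samples
               if s.rx_power_dbm >= rx_min
               and s.ec_io_db >= ecio_min
               and s.fwd_fer <= fer_max)
    return good / len(samples)

drive = [Sample(-82.0, -7.5, 0.01), Sample(-95.0, -13.0, 0.04),
         Sample(-88.0, -9.0, 0.00)]
print(f"compliant area fraction: {coverage_compliance(drive):.0%}")
```

In practice, the equivalent tallies come from the post-processing tools described in Chapter 6; the point here is only that each contract target reduces to a per-sample threshold test.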

1.2 Organization of this Document

Optimization Procedures and Guidelines discusses the optimization process (what needs to be done) and provides guidelines (entrance and exit criteria) for each step in the process. Where possible, tools to support each activity are introduced and analysis methods are discussed. The organization of this document follows the structure laid out in Figure 1.2-1, Overview of Network Optimization Activities. Emphasis is given to Chapters 6 through 10 and to new developments affecting optimization in other areas. Following along with Figure 1.2-1, a description of the contents of each chapter is provided below.

Chapter 2, Network Design Verification: Familiarizes the optimization engineer with the network design conducted to date. Identifies and documents predicted problem locations.

Chapter 3, Equipment Installation and Test Verification: Facilitates coordination between the optimization engineering team and CFE crews to verify that the BTSs for each cluster are operational and properly integrated with the CBSCs prior to the start of single cell functional testing.

Chapter 4, RF Parameters Database Verification: Checks all RF-related parameters, neighbor lists and supporting tables, and transcoder parameters.

Chapter 5, Spectrum Clearing: Ensures that the spectrum in the area of network operation stays adequately cleared during network optimization activities. Focuses on the use of alarms and mobile data to identify noise rise conditions.

Chapter 6, Data Collection and Analysis Tools Selection, Install, Test: Guidelines for the selection and implementation of optimization tools for a specific market, including references to Motorola and third-party vendor tools. Data collection tools include mobile diagnostic monitors (DMs), pilot analyzers, the System Monitoring Application Processor (SMAP), Call Detail Logs (CDLs), and PM data. Data analysis tools include COMPAS, OPAS, CAT (CDL Analysis Tool), PM Reports, and scripts to be executed at the CBSC/OMC-R.

Chapter 7, Single Cell Functional Test: Use of a mobile diagnostic monitor (DM) and test phones to generate test calls. Specifies drive route definition, pre-departure checks, data collection, and data analysis procedures using both CDLs and DM (mobile messaging) data.

Chapter 8, Initial Coverage Test: Makes use of an initial coverage test in each cluster to baseline system performance and to identify problems. Problems are recorded in a Problem Resolution Matrix (PRM), which is used during the system optimization process to track progress and communicate issues to the customer.

Chapter 9, System Optimization and Detailed Problem Resolution: After each cluster's performance has been baselined, optimization and detailed problem resolution within each cluster can proceed until the acceptable level of performance is achieved. This chapter discusses the data and analysis tools used during the system optimization process. The intent of the optimization process is to isolate and resolve all of the issues that will impact the overall quality of the CDMA system. This may also include inter-cluster, inter-CBSC, and inter-EMX handoff issues.

Chapter 10, Final Coverage Survey and Warranty Testing: Presents information for the final performance drive and the simultaneous collection and processing of adequate data to demonstrate satisfactory compliance with contractual or warranty clauses.

Chapter 11, System Operations: Covers network performance monitoring and expansion. The bulk of this material is covered in a separate, companion document, but is introduced here in the context of the Friendly User trial. The primary benefits of the Friendly User trial are the additional identification of faulty hardware throughout the network and reports from customers identifying problem areas that may not have been traversed during the original optimization.

[Figure 1.2-1 is a flow diagram. Optimization Preparation: Network Design Verification (Chapter 2), with inputs Accurate Terrain, Clutter, Xlos Tuning Data and System Design via NetPlan/CSSS; Equipment Installation and Test Verification (Chapter 3); RF Parameters Database Verification (Chapter 4); Spectrum Clearing (Chapter 5); Data Collection and Analysis Tools Selection, Install, Test (Chapter 6). Network Optimization: Single Cell Functional Test (Chapter 7); Initial Coverage Test (Chapter 8); System Optimization and Detailed Problem Resolution (Chapter 9); Final Coverage Survey and Warranty Testing (Chapter 10). System Operations and Commercial Service: Network Performance Monitoring and Expansion (Chapter 11).]

Figure 1.2-1: Overview of Network Optimization Activities


2.0 Network Design Verification/Review


2.1 Description

The initial network design activity employs simulation tools, such as NetPlan, and follows specific system design planning guidelines to predict and assess network performance based upon the placement of BTSs across the network coverage area. Figure 2.1-1 shows that this process is actually the first step of the optimization cycle. The optimization team should participate in the system design review of this simulation work, which helps orient the team to potential problems predicted by the simulation tool.

[Figure 2.1-1 repeats the overview flow diagram of Figure 1.2-1, placing Network Design Verification (Chapter 2) at the start of the Optimization Preparation phase.]

Figure 2.1-1: Relationship of Network Design Verification to Entire Optimization Process

The primary objective of the design review is to evaluate the system design to ensure that it meets customer requirements for coverage and capacity, and to generate an initial assessment of where problems may be encountered in the deployment and RF optimization of that system. Many times this activity is coupled with contractual warranty development. Since the skill set required to do a system design using NetPlan (or other tools) is fairly specialized, the optimization engineer is rarely the same person as the simulation guru; the design review therefore provides an opportunity to communicate critical network design issues to the optimization team.

It is desirable to maintain and use this simulation work throughout the system optimization process. The network planning tools should be used to play the "what if" games necessary to evaluate proposed changes intended to improve coverage or resolve other RF-related issues. Such changes may include antenna pointing angles and SIF pilot powers.

The procedures and guidelines for the network design activity are beyond the scope of this chapter; they are contained in the following documents:

- CDMA RF System Design Procedure
- CDMA RF Planning Guide

Both of these documents can be found by selecting the RF Planning button at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html. Follow the directions on that web site for downloading or ordering a bound copy.

Recent developments in the area of improving simulation accuracy include a new auto-optimizer tool, documented in Section 2.8 and Appendix 2A of this chapter.

2.2 Tools Required

Table 2.2-1 contains a list of candidate network planning and simulation tools, along with sources of the data (e.g., clutter and terrain) used by these tools. Only one is required to plan and simulate a CDMA system. The Motorola tool is NetPlan; for information on it, see the NetPlan product group's home page at http://www.sesd.cig.mot.com. NetPlan is designed to assist engineers in optimizing and implementing wireless networks by providing an accurate and reliable prediction of coverage. With a database that takes into account data such as terrain, clutter, antenna radiation patterns, and traffic modeling, as well as an intuitive graphical interface, NetPlan gives RF engineers a state-of-the-art tool to estimate RF propagation, design wireless networks, plan network expansions, optimize network performance, and diagnose system problems. [1] There is also a specialized CDMA System Static Simulator (CSSS). Table 2.2-2 contains information on reference documentation for NetPlan, including the CDMA System Static Simulator. These documents may be ordered from the TED web site at http://www.cig.mot.com/TED/docs.html; follow the steps under "How to Order Manuals and CD-ROMs".

[1] Description of NetPlan taken from the web page.

Tool Name | Description and Vendor
NetPlan Propagation Engine and NetPlan CDMA Static System Simulator (CSSS) | SOFTWARE: The NetPlan application automates many cellular system planning and analysis functions. Ordering information can be found at http://www.sesd.cig.mot.com/order.html.
Informix (required by NetPlan) | Database software, Informix.
Auto Optimizer Tool (NEW; Feature Request #00019504; uses output of NetPlan) | Automated tool that uses predictive methods to optimize any combination of antenna azimuths, downtilts, and/or pilot powers in a CDMA system. Tool and user's manual by Charles Reisman. See Appendix 2A for a copy of the User's Guide and an email of beta test results.
CellCAD | Simulation tool. More information can be obtained from LCC, 7925 Jones Branch Drive, McLean, VA 22102, USA. Phone: (703) 873-2000.
CE4 | DOS simulation tool. More information can be obtained from Lucent's web site at http://www.lucent.com/.
Cell Designer | Simulation tool. For more information, see the web sites for USWest or Primeco at http://www.uswest.com/ or http://www.primeco.com/.
Planet | Simulation tool. More information can be found at the MSI web site: http://www.msi-world.com/home.html.
QEDesign | Simulation and planning tool. See the Qualcomm web site for more information at http://www.qualcomm.com.
WiNGS | Simulation tool. More information about Glenayre and their products can be found at http://ww.glenayre.com.
Wizard | Simulation tool. Information on SAFCO's design tool can be found at http://www.safco.com.
Motorola, Inc., GIS & Remote Sensing Center of Excellence | Source of input data (clutter, terrain, road networks, etc.). More information can be found on their web page at http://www.imd.cig.mot/rm/gis.html.
CRC Research Institute, Inc., 2-7-5, Minamisuna, Koto-ku, Tokyo 136 | Source of input data (clutter, terrain, road networks, etc.) for Japan.

Table 2.2-1: Candidate Network Planning and Simulation Tools

Name | Order Number | Overview
Basic Concepts User's Guide | 68P09245A01-O | Combines reference information with procedures for using NetPlan.
RF Engineering User's Manual | 68P09245A02-O | Explains the different elements of NetPlan and how they are used.
Hardware/Software Installation Guide | 68P09245A04-O | Describes how to install and configure NetPlan hardware and software on HP and Sun systems.
Data Formats and Conversion Guide | 68P09245A07-O | Describes file formats and how to use the conversion utility programs.
Tutorial | 68P09245A08-O | Provides a quick, hands-on introduction to NetPlan.
System Administration Guide | 68P09245A03-O | Intended to provide NetPlan System Administrators with the information and procedures necessary for effective and efficient maintenance of the NetPlan system.
CDMA Static System Simulator User's Guide | 68P09245A09-O | Operator's manual for the CDMA Static System Simulator feature of NetPlan.
New Features and Release Notes | 68P09245A06-O | Overview of the new features and enhancements of NetPlan.

Table 2.2-2: NetPlan Reference Documentation

Technical reference documents on the usage of the CDMA System Static Simulator are listed in Table 2.2-3 and can be found by choosing the RF Planning button at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html.

Document Name | Author | Overview
"NetPlan 3.0 CDMA Simulator: A Technical Overview" | Ashish Kaul | Provides a technical overview of the key features, components, and outputs of the CDMA simulator in NetPlan.
"ISI Settings within NetPlan 3.0 CSSS" | Bob Love/Jim Panwell | ISI settings within NetPlan 3.0 CSSS.
"T_ADD/T_DROP CSSS Settings vs. Field Settings" | Bob Love/Jim Panwell | T_ADD/T_DROP CSSS settings vs. field settings.

Table 2.2-3: NetPlan CDMA Static Simulator Documentation

Finally, additional information can be obtained, and specific questions answered, about the NetPlan family of tools (NetPlan, CSSS, and COMPAS) by subscribing to the NetPlan posting group. Put any of the following messages in the main body of an email:

    subscribe np-prod
    subscribe np-cdma
    subscribe np-compas

and send the message to majordomo@sesd.cig.mot.com.

2.3 Personnel Required

Type | Skill Level
RF System Engineer | White Belt (see Appendix A)

Table 2.3-1: Personnel Required

Special training is required for personnel conducting the system design activity. Motorola provides comprehensive training classes on the NetPlan tools, offered by the Technical Education and Documentation Center (TED). The class numbers and names are:

    PER 300  NetPlan 3.1
    PER 310  Compas
    PER 130  NetPlan CDMA (for the Static Simulator)

Upon completion of these NetPlan courses, students are capable of using the NetPlan RF planning tool in cellular system planning, design, and optimization. TED can be contacted at 1-847-435-5700 or at http://www.cig.mot.com/TED/Training/training.html for course dates and availability.

2.4 Entrance Criteria

In order to start the design review, the following activities should be completed as part of the system design activity:

1. NetPlan Tool Installation
Before the NetPlan tool can be used, certain tasks must be completed. It is assumed that the system administrator tasks have been completed. Users will enter a basic system plan, for which they will need to know the number of switches and the number and type of sites needed to handle traffic in the area. Configurators are available for various products to help the user plan the system. The configurators and read_me files can be found at http://www.acpg.cig.mot.com/w3/apd/PIOI/new/configurator.html. For more information, see "NetPlan RF Planning Tool, Student Guide", available from the NetPlan 3.1 class offered by TED. See the TED web page for information on when the course is available and how to register.

2. Network Planning Data
A. Prior to the design of a CDMA system, network planning information such as propagation parameters, subscriber profile, call model (busy hour call completion, Erlangs per subscriber, average hold time per access), terrain data, and land use/land clutter data should all be available or estimated in adequate fidelity and resolution to properly represent the network to be simulated. [NOTE: Any simulator suffers from the "garbage in, garbage out" syndrome; it is therefore critical to ensure that the data is as accurate as possible.] Settings of important parameters, and tips on how to address them during system design, are listed in the "CDMA RF Planning Guide".
B. When simulating the system, the default parameters listed in [Matt Dillon's] release-specific spreadsheets should be used as inputs. These spreadsheets can be obtained from http://www.cig.mot.com/cdma_ase/index.html; choose the link for the target release.

3. Propagation Model Tuning Using Xlos Data
Data should be collected through drive tests to improve the accuracy of (or calibrate) the NetPlan Xlos propagation model. For information on conducting this drive test, verifying the drive test data, and then using the drive test measurements to tune the NetPlan Xlos model, see "Drive Test Procedures and Xlos Tuning Using Drive Test Data", found by selecting the RF Planning button at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html. (A simplified curve-fitting illustration follows this list.)

4. Design Outputs
Design outputs should be available. These are listed below in Figure 2.4-1 in the form of a Design Review Checklist. The items in this checklist should be available for review during the design review.
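For intuition about what "tuning" a propagation model against drive-test data involves, the sketch below fits a generic log-distance path-loss model to a handful of made-up measurements by least squares. This is not the NetPlan Xlos algorithm; the actual procedure is in the "Drive Test Procedures and Xlos Tuning Using Drive Test Data" document referenced above, and all numbers here are hypothetical.

```python
# A generic least-squares fit of a log-distance path-loss model to
# drive-test data. This is NOT the NetPlan Xlos algorithm; it only
# illustrates the idea of calibrating a model to measurements.

import math

# (distance in km, measured path loss in dB) - hypothetical samples
data = [(0.5, 112.0), (1.0, 121.5), (2.0, 130.8), (4.0, 141.2)]

# Model: PL(d) = A + 10 * n * log10(d); solve for intercept A and slope n.
xs = [10 * math.log10(d) for d, _ in data]
ys = [pl for _, pl in data]
n_pts = len(data)
x_bar = sum(xs) / n_pts
y_bar = sum(ys) / n_pts
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

print(f"fitted exponent n = {slope:.2f}, intercept A = {intercept:.1f} dB")
rmse = math.sqrt(sum((intercept + slope * x - y) ** 2
                     for x, y in zip(xs, ys)) / n_pts)
print(f"RMS error = {rmse:.2f} dB")
```

A fitted exponent higher than the free-space value (n = 2) reflects the clutter and terrain of the market; a large RMS error suggests that the single-slope model, or the data itself, needs review.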

Figure 2.4-1: Design Review Check List (Page 1)

Coverage Area (as defined by Customer)
______ Image/Map indicating the Desired Coverage Area
______ Identification of critical regions within Desired Coverage Area

Rx Link Budget
______ Link Budget(s) used for Rx Pathloss Coverage Plot. Should be supplied on a per site/sector basis if there is variation in antenna gain or line loss from site to site and sector to sector.
______ Calculations of NetPlan Rx ERP and system level cutoff.

NetPlan Pathloss Plots
______ Most likely Server image
______ Receive Voice Power image coverage (unloaded)
______ Coverage Exclusion Mask image (if used)
______ Clutter image (optional)
______ Virtual Obstruction Heights (if not standard)
______ Elevation image (optional)
______ Propagation boundaries

Cell Configuration data/site/sector
______ Antenna Configuration: type, height, bore, tilt, gain, beamwidth, ERP, effective gain, cell Noise Figure, and ambient noise.
______ Antenna pattern modifications (if used)
______ Pilot ______ Page ______ Sync
______ T_ADD ______ T_DROP ______ Tch_Max ______ Tch_Min

NetPlan CDMA Simulation Parameters
______ Complete printout of the Basic and Advanced CDMA parameters.

Figure 2.4-1: Design Review Check List (Page 2)

Simulation Images
______ Best Ec/Io
______ Best Ec/Io Server/Sector
______ Soft Handoff State
______ Forward Required Power
______ Reverse Required Power

BBN Cornerstone Statistics/Sector (50-200 drops)
______ NumMob ______ Links ______ Rise
______ MobGood/NumMob ______ Total Forward Power
______ Soft Handoff Factor ______ # Mobiles

Traffic-specific information
______ Traffic distribution image
______ Weighting specifics of clutter/roadways, etc.
______ Traffic Exclusion Mask image (if used)

ISI Information (for 800 MHz systems <1:1 deployment)
______ AMPS-only site cell data with ERP used
______ CDMA images with and without presence of ISI

Neighbor List
______ Not used
______ Used and applied at the end of optimization

Other general information
______ Where is the D/A handoff region defined


2.5 Procedure

Each of the items in the Design Review Check List should be evaluated, with emphasis given to learning about the system design and identifying any problem areas.

2.5.1 Review of Propagation Model Tuning

Validation procedures to tune the Xlos propagation model are found in the manual "Drive Test Procedures and Xlos Tuning Using Drive Test Data", available by selecting the RF Planning button located at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html. The usage of these procedures should be reviewed, and reasonableness checks of the NetPlan Pathloss Plots should be conducted.

2.5.2 Review of NetPlan Inputs

The simulator inputs should be reviewed to assess their accuracy and resolution. The input items listed in Figure 2.4-1, Design Review Check List (Coverage Area Definition, Rx Link Budget, Cell Configuration Data, and NetPlan CDMA Simulation Parameters) should be reviewed.

2.5.3 Review of NetPlan Image Outputs

Reference material on NetPlan Simulator Statistical Output and Analysis, NetPlan Cell/Mobile Analysis, and NetPlan Simulator Images Output and Analysis can be found in the "CDMA RF System Design Procedure". NetPlan will create a variety of images that quantify various predicted performance aspects of the network. Since it is important for the system optimization team to understand the system design in order to guide optimization activities, the images listed below should be examined during the design review. The following images should be inspected to determine any basic design issues:

1. Site Propagation Images:

A. Best Signal Strength (Ec/Io): This is the best signal strength of the best server/sector. For each point in the image, the best signal strength is the signal strength of the strongest sector. A coverage report is also included.

B. Best Server/Sector: This will show which site or sector is the best server for a given area on a site-by-site or sector-by-sector basis. The best server is the sector with the strongest signal strength at that position. A coverage report and a sector boundary image are included.

C. Forward Required Power: At each geographic bin, this image displays the traffic channel power required to be transmitted by the best serving sector/site (as displayed in the Best Ec/Io Server/Sector image) to make the forward link with the probe mobile placed in that bin successful. Locations in the system that do not have any Best Ec/Io Server/Sector are assigned a 70 dBm value by default, and bins outside the Combined Image boundary are assigned the value of 100 dBm.

D. Reverse Required Power: The Reverse Required Power image displays the true power required to close the reverse link. If the reverse required power exceeds the maximum power the mobile can transmit, this will create a coverage problem on the reverse link. At each geographic bin, this image displays the traffic channel power required to be transmitted by the probe mobile to successfully close the reverse link with the best serving sector/site (as displayed in the Best Ec/Io Server/Sector image). There have been occurrences of locations in the system that do not have any Best Ec/Io Server/Sector being assigned a default value of 100 dBm.

*Note: Comparison of forward and reverse required powers may indicate some disparity between the link budgets on the forward and reverse links. Diversity receive antennas may be situated more optimally than singular downtilt antennas, creating differences in coverage patterns for the forward and reverse links. The height of the transmit vs. the receive antennas can also have an impact on coverage patterns. The engineer should also know the antenna type(s) as well as the tilts on both the Tx and Rx antennas.

2. Interference Images (High FER Areas):

A. Worst Interferer: This image will show which site or sector provides the strongest interfering signal.

B. Single C/I: This shows the carrier-to-interferer ratio for the best sector and the strongest interferer.

C. Multiple C/I: Multiple C/I images show the carrier-to-interference ratio between the best sector and all interferers. A negative number means that the sum of the interferers is stronger than the carrier.

3. Delta Images (the difference between the values in any two images of the same type):

A. Best Server/Sector Delta: This compares two Best Server/Sector images. The image shows points that are in image 1 but not image 2, points that are in image 2 but not in image 1, points that have the same value in both images, and points that have different values in the two images.

B. Signal Strength Delta: This image compares two signal strength images, either antenna signal strength images (site propagation images) or Best Signal Strength images. Points in the image show the difference between the signal strengths in the images being compared.

C. C/I Delta: This will compare two C/I images. Points in the image show the difference between the carrier-to-interferer ratios of the images being compared.
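As a rough illustration of how a delta image is formed, the sketch below computes a bin-wise Signal Strength Delta from two images. It assumes the images have already been decoded into 2-D arrays of dB values, with NaN marking bins absent from an image; the NetPlan binary image format itself is not shown here.

import numpy as np

# Hypothetical sketch: bin-wise comparison of two signal strength images
# that have already been decoded into 2-D arrays of dB values, with NaN
# marking bins that are absent from an image.

def signal_strength_delta(img1, img2):
    """Return the per-bin difference img1 - img2 plus membership masks."""
    only_in_1 = ~np.isnan(img1) & np.isnan(img2)  # bins present only in image 1
    only_in_2 = np.isnan(img1) & ~np.isnan(img2)  # bins present only in image 2
    delta = img1 - img2                           # NaN wherever either bin is missing
    return delta, only_in_1, only_in_2

img_a = np.array([[-85.0, -92.0], [np.nan, -101.0]])
img_b = np.array([[-88.0, -92.0], [-95.0, np.nan]])
delta, a_only, b_only = signal_strength_delta(img_a, img_b)
print(delta)  # positive values where image 1 is the stronger of the two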

2.5.4 Review of PN Offset Plan

A PN Offset Plan should be reviewed during the design review. A comprehensive discussion of PN Offset Planning and Search Windows can be found in the "CDMA RF Planning Guide".

2.5.5 Neighbor List Review

The optimization team members should develop an understanding of how neighbor lists were used in the simulation environment. For more information on the review of neighbor lists, see Chapter 4 (Database Verification).

2.6 Analysis Conducted
The primary analysis centers on understanding any network design issues. These issues may include, but are not limited to, missing sites, interference, and inaccurate or sub-optimal placement of sites by the customer. These issues should be captured in a Problem Resolution Matrix (PRM). The PRM will be used later in the optimization process to track performance issues. Entry of specific problem areas (into the PRM) predicted by simulation work can guide optimization teams so that they do not waste time focusing on known problem areas. The PRM should clearly capture the rationale for why there may be sub-optimal performance in various areas of the desired coverage area. An example may be zoning laws, which could restrict the location of a site or the height of an antenna.

Using the images, a PRM can be generated. For more information on the different images in NetPlan, please see the "RF Engineering User's Manual", manual number 68P09245A02-O. This can be ordered from the TED web site at http://www.cig.mot.com/TED/docs.html; follow the steps under "How to Order Manuals and CD-ROMs". As a very simple example, perhaps the simulation tool predicted the following, as shown in Figure 2.6-1, based upon the input to the design process:

[Figure 2.6-1: Simulation Prediction of Coverage Areas. Legend: no coverage desired; covered area; missing site; pathloss; interference.]
Given this scenario, a simplified corresponding PRM may look like Table 2.6.1 below:

Tracking  Description            Cause                     Problem  Status/Possible Fix(es)                      Resolution
Number                                                     Date                                                  Date
1         No coverage            Site 101 missing          1/21/99  Waiting for equipment
2         No coverage/path loss  Ridge blocking antenna    1/21/99  Raise antennas
3         Interference           Multiple pilots           1/21/99  Confirm in field and create dominance
4         No coverage            Site 122 missing          1/21/99  Awaiting zoning approval
5         Interference           Sites 132, 144 too hot    1/21/99  Confirm in field; lower pilots of 132, 144
6         No coverage/path loss  Sites 145 and 148 too     1/21/99  Confirm in field; change or raise antennas,
                                 far apart                          or increase pilot power

Table 2.6.1: Simplified Problem Resolution Matrix for Simulation Prediction

A more detailed PRM example is included in Appendix 2B. Finally, in addition to the problems captured above, the system design must meet the intent of the contract. A final checklist, as shown below in Figure 2.6-2, should be used to determine if the simulation will produce a system that will meet the contractual commitments.


System Acceptance Criteria (check if met)
______ ngmob/nmob > 95% system mean
______ ngmob/nmob > 90% any individual cell/sector

Description of Rx Coverage
______ Good
______ Moderate/Poor; problems may be due to: ______ Terrain/Clutter limitations ______ Cell Configuration

Description of Simulation Coverage
______ Good
______ Moderate/Poor; problems due to: ______ Terrain limitations ______ Multipath pilots ______ ISI ______ High traffic

Description of coverage variations compared to Pathloss-only study:
______ More CDMA coverage ______ Less CDMA coverage

Overall Assessment
______ Pass: proceed with implementation
______ Recommended changes need to be incorporated, followed by another design review

Detailed Assessment: (attach documents providing additional detail if required)
____________________________________________________________
____________________________________________________________
____________________________________________________________

Figure 2.6-2: Final Checklist for System Acceptance Criteria
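The ngmob/nmob criteria in the checklist above are simple ratios of good mobiles to total mobiles over the Monte Carlo drops. A small sketch follows, assuming per-sector counts have been extracted from the simulation's cell/mobile statistics; the counts shown are invented for illustration.

# Sketch of the ngmob/nmob acceptance checks. The per-sector good mobile
# (ngmob) and total mobile (nmob) counts are invented placeholders; real
# counts come from the simulation's cell/mobile statistics.

sector_stats = {
    "101-1": (188, 195),
    "101-2": (190, 200),
    "102-1": (176, 192),
}

ratios = {s: ng / nm for s, (ng, nm) in sector_stats.items()}
system_mean = (sum(ng for ng, _ in sector_stats.values())
               / sum(nm for _, nm in sector_stats.values()))

print(f"System mean ngmob/nmob: {system_mean:.1%} (target > 95%)")
for sector, ratio in sorted(ratios.items()):
    flag = "" if ratio > 0.90 else "  <-- below the 90% per-sector target"
    print(f"{sector}: {ratio:.1%}{flag}")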


2.7 Exit Criteria

- Propagation model tuning was adequate.
- The target release default parameters were used as inputs to the system design.
- Different inputs were used and the different effects were compared.
- The design verification team reviewed images.
- A preliminary neighbor list has been generated and handed off to the person(s) responsible for creating the MIB.
- A valid PN Offset Plan has been designed.
- An initial PRM has been completed.
- The system design meets the customer requirements.

2.8 Recent Developments

Appendix 2A shows results of an auto-optimizer tool that is used in conjunction with NetPlan/CSSS. This auto-optimizer tool focuses on converging to a set of antenna azimuth and tilt angles and BTS SIF pilot powers. The tool has been successfully demonstrated to provide performance improvements in CDMA systems in Sapporo and Osaka, Japan (both DDI systems), and in Curitiba, Brazil (Global Telecom). This utility is currently being tracked as a Feature Request (FR number 00019504) for incorporation into NetPlan. [Note: This new auto-optimizer tool is not the same as the auto-pilot optimizer tool previously investigated. The new tool also has the advantage of including antenna angle optimization as part of its routine.]


APPENDIX 2A: New Developments In Simulation Domain Optimization


1. Email from Charles Reisman Outlining the Auto-Optimizer Test Results

Subject: KCT and DCT Antenna Downtilt Change Test Results
Date: Mon, 02 Nov 1998 18:26:09 +0900
From: Charles Reisman

I would like to thank everyone for your cooperation with the antenna downtilt testing which has been performed in Sapporo and in KCT. The Sapporo area (DCT) downtilt testing has been completed, analyzed, and presented to the customers, and the first problem location within the KCT system has also been tested and analyzed. I would like to take this opportunity to summarize the results and also convey the customer reactions (in the case of the DCT testing).

I'd like to start with the KCT testing. This testing focused on the area between and immediately around sites 89, 114, and 514, where simulations predicted a significant coverage hole. (This area was called Problem Location 2 in previous mails.) The area was first driven with the existing tilts in order to establish a baseline, and then 7 sectors' antennas were uptilted and the area was redriven. The following table shows a before/after comparison of the KCT results:

Type of Data                           BEFORE     AFTER
Avg. Forward FER                       1.2%       1.0%
Avg. Best Ec/Io                        -7.1dB     -6.9dB
Avg. Aggr. Ec/Io                       -5.3dB     -5.2dB
Avg. Mob Tx Power                      -18.6dBm   -21.5dBm
Avg. PMRMs per minute                  9.2        8.5
% of Data where Mob Tx Pwr >= 10dBm    0.0%       0.0%
% of Data where Mob Tx Pwr >= 17dBm    0.0%       0.0%
% of Data where Fwd FER >= 10%         1.7%       1.8%
% of Data where Fwd FER >= 20%         0.7%       0.4%
% of Data where Fwd FER >= 30%         0.5%       0.2%
% of Data where Best EcIo <= -13dB     0.6%       0.2%
% of Data where Best EcIo <= -15dB     0.4%       0.1%
% of Data where Agg EcIo <= -10dB      1.0%       0.5%
% of Data where Agg EcIo <= -12dB      0.5%       0.1%

It's especially noteworthy that the average forward FER dropped (i.e., improved) by almost 20%. In addition, the quantile data shows that the percent of area with very high FER (>= 20%, >= 30%) was greatly reduced by the downtilt changes. This should manifest itself to the end users in improved voice quality. Furthermore, the nearly 3dB reduction in mobile transmit power will equate to improved battery life, and the reduction in PMRMs will likely translate to slightly reduced average transmit power levels at the base stations. Although not shown in the above table, the results also showed slight reductions in the channel element SHO factor and call processing messaging rates (e.g., PSMMs, EHDMs, HCMs). From this, we can conclude that the customer will also be able to get more out of the infrastructure (i.e., fewer channel elements required and greater MM capacity) as a result of the downtilt changes.

In summary, this test was an all-out success. We definitely should continue this testing, and I hope that the other predicted problem areas show similar improvements. I am not sure whether or not these results have been presented to DDI and KCT. If they haven't been, then it would probably be best to present them soon.

The DCT experimentation showed similar results to the KCT testing, although the baseline drive (which was based on PDC downtilts) showed comparatively better performance than the KCT baseline did. The Sapporo test drive area was significantly larger than that of the KCT testing, and along with specific areas which got better there were also specific areas that got worse (e.g., in terms of forward FER). The following table summarizes the major results from the DCT testing:

Type of Data                           PDC Tilts  Simulated
Avg. Forward FER                       2.0%       1.9%
Avg. Best Ec/Io                        -6.9dB     -6.8dB
Avg. Aggr. Ec/Io                       -5.5dB     -5.4dB
Avg. Mob Tx Power                      -19.6dBm   -19.6dBm
Avg. PMRMs per minute                  10.9       9.8
% of Data where Mob Tx Pwr >= 10dBm    0.5%       0.1%
% of Data where Mob Tx Pwr >= 17dBm    0.1%       0.0%
% of Data where Fwd FER >= 10%         4.3%       3.6%
% of Data where Fwd FER >= 20%         1.6%       1.3%
% of Data where Fwd FER >= 30%         0.8%       0.7%
% of Data where Best EcIo <= -13dB     1.1%       0.6%
% of Data where Best EcIo <= -15dB     0.3%       0.2%
% of Data where Agg EcIo <= -10dB      1.6%       1.0%
% of Data where Agg EcIo <= -12dB      0.8%       0.4%

This data shows that the downtilt optimization process was very effective in reducing the percentage of area with poor coverage, resulting in net performance improvements overall. And as with the KCT testing, the SHO factor and call processing rates were also reduced, indicating that the customer will be able to get more out of the deployed equipment. I would like to note the following two items in regard to the DCT results:

1. In this testing, the XC-related R8.1 PSMM filtering functionality was accidentally left enabled in both test cases (while the MM-related parameters were disabled). This may be the cause of the slightly higher than expected forward FER levels. In any case, since these settings were used for both test cases, it should still be possible to draw conclusions from the above data.

2. Simulations predicted relatively similar performance in terms of basic coverage between the PDC downtilts and the simulated downtilts. The results above confirm this, although it's also clear that the simulated tilts perform a drop better. As the system loads, however, the improvement associated with the simulated tilts should become more and more significant. (The main point here applies to all systems.)

DDI agreed that the simulated tilts provide a better starting point for system optimization, and Ozaki-san (of DDI) even went as far as saying that it seems to have been the wrong conclusion to arbitrarily want to go forward with PDC downtilts. DCT was also happy to see the positive results today, and requested the new design methodology to be used over a wider area (i.e., to re-simulate portions of the system including more of Sapporo and also Asahikawa). Ozaki-san also mentioned that the preliminary drive testing in TCT, where the new design techniques have been used to some degree, is showing excellent results.

The results of these two experiments indicate that the simulations are fundamentally accurate, and that the new design methodology does indeed produce system designs with improved system performance. They also show that simulations can be a useful tool to identify poorly performing areas and to improve them. Since the DCT downtilts were also determined in part by the downtilt auto-optimizer, the DCT results speak favorably of its performance as well.

Going forward, I think it would be great if we could apply the new design methodology to all system design work, including retroactively applying it to problem areas in already existing systems such as KCT, QCT, and OCT. It would be possible to set up the auto-optimizer, for example, let it run for some period of time, and apply the results.

Please let me know if you have any questions or comments.

Best Regards,
Charles

2. Auto-Optimizer Tool User's Manual, by Charles Reisman

The auto-optimizer is an automated tool that, using predictive methods, optimizes any combination of antenna azimuths, downtilts, and/or pilot powers in a CDMA system. Rather than focusing on antenna changes during the system optimization stage, these can be evaluated and optimized during the system design stage, leaving fine-tuning for the system optimization stage. This can yield significant cost savings in deploying systems and can also result in significant reductions in project schedules. Furthermore, it is likely that the final system performance will be more optimal, as it is very difficult during the system optimization stage to do anything more than localized parameter/setting optimizations. The only reasonable place for comprehensive changes (e.g., widespread antenna downtilting) is in the initial system design itself. This tool is especially useful for markets or systems relying on NetPlan RF propagation predictions in which the predictive model is based on accurate elevation and clutter data and has been validated and/or tuned.

This paper describes how to set up and use the auto-optimizer program. The appendix also provides some degree of explanation regarding the tool's theory of operation. The information herein is accurate as of auto-optimizer version 1.00.

Overview

The traditional system design methodology relies on generating baseline plots, making changes, regenerating plots, and evaluating the effects of the changes. This, of course, is an essential process, but many changes to the system result in visually imperceptible changes in performance, and as a result, many such modifications may not be followed up on and implemented even though they may actually provide coverage and/or performance benefits. In addition, many potential improvements may not be obvious to the system designer, and as such these parameter/setting changes may not even be considered and attempted within the traditional system design work. When many such changes are passed over, the result is a system that by design does not perform as well as it could.

The design methodology upon which the auto-optimizer is based looks for performance trends in the CDMA simulator output images. Specifically, the auto-optimizer judges good and bad based upon coverage (both forward and reverse links) and pilot Ec/Io predictions. In this sense, it relies on both pathloss and Ec/Io levels in evaluating whether or not a certain change represents an improvement, with pathloss being weighted higher than Ec/Io.

The user can specify to optimize one or more of azimuths, downtilts, and pilot powers, including in which order to proceed in the case of multiple types of optimizations. In addition, the user can specify which sectors to optimize, whether it be all or a reduced subset. Baseline images are created and bin counts are tallied at the start of the optimization process. After that, sector settings are changed and tested one by one. The auto-optimizer tool wraps around the NetPlan XLOS and CDMA simulator executables and repeatedly makes such single-sector changes, rerunning pathloss calculations (only when performing azimuth or downtilt changes) and CDMA simulations each time on the way to optimizing the system. The tool utilizes bin-counting metrics to determine whether attempted changes are accepted or rejected. If the auto-optimizer accepts a change, the new tallies become the new baseline.

The key to the auto-optimizer's effectiveness is its ability to decode the NetPlan binary image contents, map the bins from one image with those from the others (e.g., forward power threshold, reverse required power, and best Ec/Io), and map best server sectors to each of the bins, thereby linking each bin in the image to a particular sector. Instead of a best server, one may choose to create and use a natural server image for generally improved results. A natural server image is similar to a best server image, and, in fact, is identical in form, but there is a significant difference in that the natural server image shows the server with the lowest pathloss to a particular bin. (This is not necessarily entirely true in practice, but the reader is encouraged to conceptualize natural server images as such.)

In the course of operation, the auto-optimizer does not generally reverse changes on itself, but it will reevaluate a sector any time a change made on a different sector appears to affect the sector under consideration. In this sense, if another sector sufficiently changes the RF environment in a particular area, it is possible that a previous modification may be reversed. Of course, it is also very possible that a change brings further change, a chain reaction.

Since the auto-optimizer relies on XLOS RF propagation predictions as well as the CDMA simulator model, it is only as accurate as the underlying calculations. Therefore, for maximum accuracy it is recommended to use it in conjunction with high-resolution elevation data and high-resolution clutter (land use/cover) data in NetPlan, and to tune the propagation prediction model in one way or another. At the very least, one should verify that the propagation predictions are within the ballpark, and apply some type of error correction if they are not.

The auto-optimizer has been tested in three markets as of December 1998: Osaka, Japan (KCT); Sapporo, Japan (DCT); and Curitiba, Brazil (Global Telecom). In addition, the underlying design techniques are being used in other Japan markets as well. In the Japan markets, only downtilts have been modified to date. In Brazil, changes have been made to both antenna azimuths and downtilts. In each market, before and after drive tests were performed and analyzed, and the system performance was found to improve as a result of the recommended changes.

It is noteworthy that the KCT system had already undergone a full system optimization process, and the new design methodology led to problem fixes in areas that the system optimization team did not even realize could be improved.

Lastly, it should also be noted that the auto-optimizer is a work in progress. The algorithms used within the tool are known to be effective, but it is not yet clear whether they are as optimal as they could be. If you should have any comments, please send them by email to Charles Reisman (A10510@email.mot.com).
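The bin-counting comparison described in the overview can be pictured roughly as follows. This is a conceptual sketch only; the actual tally categories, thresholds, and weights inside the tool are not documented here, and the ones below are assumptions.

# Conceptual sketch only: the tally categories and weights below are
# assumptions for illustration, not the tool's documented internals.

def tally_bins(bins):
    """Count bins meeting each condition. Each bin is a dict carrying
    forward/reverse coverage flags and the pilot Ec/Io in dB."""
    return {
        "fwd_covered": sum(b["fwd_ok"] for b in bins),
        "rev_covered": sum(b["rev_ok"] for b in bins),
        "ecio_ok": sum(b["ecio"] >= -13.0 for b in bins),
    }

def is_improvement(baseline, trial, coverage_weight=2, ecio_weight=1):
    """Weighted comparison of trial tallies against the baseline, with
    coverage (pathloss) weighted more heavily than Ec/Io."""
    score = coverage_weight * (trial["fwd_covered"] - baseline["fwd_covered"])
    score += coverage_weight * (trial["rev_covered"] - baseline["rev_covered"])
    score += ecio_weight * (trial["ecio_ok"] - baseline["ecio_ok"])
    return score > 0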

Requirements

In order to use the auto-optimizer, it is necessary to have a computer with NetPlan 3.1 (either Sun or HP) with the CDMA Simulator option and Perl version 5.0 or later. NetPlan 3.2 may also work, but it has not been tested yet. (Whatever the case, the auto-optimizer will be made to work with NetPlan 3.2.)

Several sub-utilities and general-purpose Unix utilities are also required. These include:
- Calc_Bins (includes calls to the sub-executables image_values, hist_values, and Calc_Max_RevPwr)
- POLYFIL
- compress_all
- Standard NetPlan sub-executables: glosB, compimages, grid_comb, and cdma
- Standard Unix programs, such as cp and cmp

When you launch the auto-optimizer, it will confirm that each of the above programs (except POLYFIL) is available, and the run will fail if any is not found.

Setup

To set up the auto-optimizer, the following steps should be followed:

1. Set up a NetPlan analysis.

2. For speed and efficiency during the auto-optimizer execution, it is recommended to customize the sites' propagation boundary using the NP_to_CSSS tool. Since the auto-optimizer repeatedly calls for XLOS and CDMA simulator calculations to be performed, and these calculations take much time, it is desirable to minimize the calculation area. The NP_to_CSSS tool requires the user to specify a calculation frame and a maximum propagation distance. The frame setting specifies a window size around the cell sites in the analysis, such that if the frame is set to x meters, calculations will not be performed beyond x meters west of the westernmost site, north of the northernmost site, east of the easternmost site, and south of the southernmost site. The maximum propagation distance setting serves as a maximum value in case the calculated distances (i.e., to the frame) would otherwise exceed it. For example, if the distance from a site to the western edge of the frame were 100km, and the maximum propagation distance were set to 48km, the site's propagation boundary in the western direction would be set to 48km. A calculated distance of 30km, for example, would be left as is since it is less than the maximum propagation distance setting. (A small sketch of this clamping logic follows the setup steps below.) The NP_to_CSSS utility creates an SQL script that should be run directly on the NetPlan Informix database. An alternate method of limiting the image calculation space is to customize the combined image propagation boundaries within NetPlan. This will save time on the CDMA calculations, but it will not minimize the time required for XLOS pathloss calculations.

3. Modify the antenna pattern(s) and make a natural server image. This is essentially the same process as when creating a best Ec/Io server/sector image, except before running the CDMA simulations, one should either:
- Create a special antenna pattern or patterns with the vertical pattern data points all set to 0dB (recommended), or
- Set all antenna downtilts to 0 degrees.
This particular simulation should be performed with very favorable conditions (e.g., unloaded, no vehicle/building penetration loss, etc.), as the only purpose of the resulting image is to tie each bin to a best server, and as many bins should be tied to best servers as possible (i.e., it is undesirable to have uncovered bins). It does not serve in any way to show the effects of loading on performance. Only one Monte Carlo drop is required. The resulting EcIoServer_1 image (which can be found in the CDMA_DROP/EcIoServer directory below the NetPlan analysis main directory) must then be copied into a new directory called NaturalEcIoServer, which must be created below the analysis main directory.

4. Apply a polygon filter to the natural server image. Create a polygon which defines the outer edge of the area you'd like to consider. In general, this polygon serves to filter out areas which are not within the intended coverage area, such that terrain and/or topology do not adversely affect simulation results. For example, consider a flat urban area which is immediately surrounded by mountains. It is very likely that the mountainsides will not show good coverage, and therefore the simulations would consider those as areas of potential improvement. Downtilt optimization (were that the optimization mode in question) might therefore lead to a significant degree of uptilting if not otherwise controlled, yielding improved mountainside coverage, but at the expense of the more important area below. The polygon filter is needed in order to minimize such undesired changes. To apply the polygon filter, first create a polygon or polygons using the NetPlan polygon tool, and then create a traffic distribution map using the Create Map from Polygons option with an arbitrary non-zero Erlang level for the polygon(s). Then, copy the created traffic distribution map, which will be located somewhere under the traffic_map directory within the analysis main directory, into the NaturalEcIoServer directory. Next, from the NaturalEcIoServer directory, apply the polygon filter as follows:

POLYFIL EcIoServer_1 Polygon_Map

substituting the appropriate name for the polygon traffic map as necessary. Next, back up the original EcIoServer_1 file, which will not have been touched by the above operation.

Lastly, rename the filtered image file, which will be called EcIoServer_1.FIL, to EcIoServer_1. The natural server image is now ready for use.

5. Restore the actual antenna patterns if changed above, and rerun all pathloss calculations such that they are up to date.

6. Start up a new NetPlan session from the shell (the exact command will depend on the installation, but may be as simple as NetPlan) after entering the following command (assuming K-Shell is the shell in use):

export AIM_DEBUG=3

7. Open up the analysis, and confirm once again that step 5 above has been executed. Then, using the Edit->Antenna function, arbitrarily change the antenna downtilt for one and only one sector. Press the Update button to have the change registered in the database. Then, return the changed downtilt parameter back to its original setting, and press the Update button once again. Then launch site propagation calculations by choosing the Images->Create option. Click on Site Propagation, then choose the site whose downtilt was changed, add the CDMA Antenna Gain to be calculated, and press OK. Be sure not to include any of the analog/TDMA transmit/receive calculation options. Various debug messages will appear in the AIM window. Take special note of the /tmp directory (e.g., /tmp/BD12a345678) in which these AIM files are located. This is the directory which will be entered as the NP_GLOS_DIR in the main auto-optimizer configuration file (see below).

8. Set up the CDMA simulation parameters for one Monte Carlo drop (for both statistics and images) and for the various other design parameters. Unlike the previous simulation run in which the natural server image was created, this time the simulation should be set up to reflect the actual design conditions, which should be considerably more conservative. The number of mobiles should be set to a moderately heavy load (in constant mode), and the other link budget parameters (e.g., mobile antenna gains, fading margin, etc.) should likewise be set conservatively. In fact, at the system designer's discretion, these settings might be set even more conservatively than the actual design calls for. In this manner, it might be possible to guard against degradation to in-building coverage, for example, for a system that was otherwise designed only for in-vehicle loss. Without making any further database changes, once again select the Images->Create option, but this time select the CDMA Simulator. Then, select the following image/data types and press OK:
- Reverse Required Power
- Best Ec/Io
- Best Ec/Io Server/Sector
- Maximum TCH Threshold Exceeded
- Cell and Mobile Statistics
Once again take note of the /tmp directory name which appears in the window. This will be the name that is entered as the NP_CDMA_DIR variable setting in the main configuration file.

9. Using the npadmin utility, export the Informix database data associated with the analysis under test. It is okay if the exported data includes more sites or site versions than those in the analysis. The path for the unload (unl) directory will become the setting for the NP_UNLOAD_DIR configuration file entry.

10. Set up the main auto-optimizer configuration file. The file is comprised of various token definitions, one per line, in which all of the major settings and execution options are specified. Please refer to the next section for more detail on the various tokens.

11. Execute the auto-optimizer with the following command:

Auto_Optimizer -C Auto_Optimizer.cfg

substituting the name of the main configuration file, as appropriate. The auto-optimizer must be launched from the NetPlan analysis main directory, regardless of where the configuration file and miscellaneous input and output directories are located.
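As referenced in step 2, the interaction between the frame setting and the maximum propagation distance reduces to a simple clamp. A minimal sketch, using a hypothetical helper simplified to one site and four cardinal directions:

# Sketch of the frame / maximum propagation distance interaction from
# step 2, simplified to one site and four cardinal directions (km).

def propagation_boundaries(dist_to_frame_km, max_prop_km):
    """Clamp each direction's calculated distance to the maximum setting."""
    return {d: min(km, max_prop_km) for d, km in dist_to_frame_km.items()}

# A site 100 km from the western frame edge with a 48 km maximum ends up
# with a 48 km western boundary; a 30 km calculated distance is left as is.
print(propagation_boundaries(
    {"west": 100.0, "north": 30.0, "east": 55.0, "south": 20.0}, 48.0))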

The Main Configuration File

In order to use the auto-optimizer, one must set up a configuration file that contains the necessary settings and references to other files and directories. The following is a sample configuration file:

#Sample Auto_Optimizer Configuration File:
ADJUSTABLE_SECTOR_FILE = Optimizer/adjustable_sectors
ANT_INFO_FILE = Optimizer/antenna_info
ARCHIVE_DIR = Optimizer/Opt_Archive
AZIMUTH_INCREMENT = 5
AZ_ADJ_SECTOR_FILE =
CALC_BINS_CONFIG = Optimizer/Calc_Bins.cfg
DT_ADJ_STRATEGY = B-U
EMAXX_MODE = YES
INDEX_SECTOR_COUNT = 5
LIMIT_BINS = YES
MAX_AZIMUTH_SHIFT = 180
MAX_PILOT_POWER = 6
MIN_PILOT_POWER = 0.75
NONADJUSTABLE_SECTOR_FILE =
NP_CDMA_DIR = /tmp/BD56c501239
NP_GLOS_DIR = /tmp/BD25c429269
NP_SITE_VERSION = 2
NP_SYSTEM_NAME = DCT_CDMA
NP_UNLOAD_DIR = Optimizer/export/unl
OPTIMIZATION_MODE = DP
OUTPUT_LOG_FILE = Optimizer/Auto_Opt.out
PILOT_ADJ_FACTOR = 1.26
PP_ADJ_SECTOR_FILE = Optimizer/adjustable_sectors
PRIMARY_REGION = A
PRIORITY_SECTOR_FILE =
RESULT_TABLE_FILE = Optimizer/Result_Table
RUN_INIT_SIMULATION = YES
TIME_LIMIT =
USE_NAT_SERVER = YES

Any line beginning with a # is taken as a comment and is ignored. Any white space characters are also ignored. The token name falls to the left of the equals (=) sign, and its value is to the right. In some cases, when a certain setting is not applicable, it is all right for the value to be left null. The various configuration file tokens carry the following meanings and usages:

ADJUSTABLE_SECTOR_FILE
Defines the file that specifies which sectors' downtilts may be adjusted by the auto-optimizer. If not set, then the auto-optimizer assumes that all sectors' downtilts may be adjusted (barring entry in the NONADJUSTABLE_SECTOR_FILE). The file contains one sector per row, represented by two columns with the site number in the first column and the sector number in the second. The two columns should be separated by a white-space character (e.g., a tab). Optionally, the user may also include a third column, which defines the region code with which the particular sector will be associated. If used, this code will tie the site to a Calc_Bins calculation region. If the third field is not present, the sector will be associated with the PRIMARY_REGION. When using a polygon-filtered natural server image, the region field is generally left unassigned, with the polygon filter serving to eliminate unwanted areas from the calculations.

ANT_INFO_FILE
Specifies the name of the antenna information file, a required file that defines the characteristics of the antennas in use in the analysis. This file is described in greater detail later in this document.

ARCHIVE_DIR
Specifies the directory in which the individual trial data directories are archived. This directory must be empty at the start of the auto-optimizer run.

AZIMUTH_INCREMENT
Defines the increment by which antenna azimuths are changed. For example, if this value is set to 10, then antenna bore angles will be adjusted clockwise or counterclockwise in 10-degree steps.

AZ_ADJ_SECTOR_FILE
This is the equivalent of the ADJUSTABLE_SECTOR_FILE setting, but it applies to antenna azimuth adjustments. Unlike the case of downtilt adjustments, however, sectors must be defined in this file in order to be adjustable (as is the case with PP_ADJ_SECTOR_FILE).

CALC_BINS_CONFIG
Defines the Calc_Bins utility configuration file. The Calc_Bins utility is documented separately. When using the auto-optimizer with a polygon-filtered natural server image, it is generally preferable to use a default Calc_Bins configuration file that contains nothing except for one line with the PRIMARY_REGION code letter.

DT_ADJ_STRATEGY
This setting defines how the auto-optimizer should approach antenna downtilt changes. There are six options, defined as follows:
D    Performs downtilts only.
U    Performs uptilts only.
B-D  Attempts both uptilts and downtilts in an alternating fashion, but will first attempt a downtilt. This setting is essentially the same as B-U, except for the very first trial.
B-U  Attempts both uptilts and downtilts in an alternating fashion, but will first attempt an uptilt. This setting is essentially the same as B-D, except for the very first trial.
D-U  Will perform only downtilts to start out with, and when it finds it cannot make any more adjustments will also attempt uptilts. After uptilt changes begin, further downtilts are still possible, as changes can trigger additional changes.
U-D  Will perform only uptilts to start out with, and when it finds it cannot make any more adjustments will also attempt downtilts. After downtilt changes begin, further uptilts are still possible, as changes can trigger additional changes.

EMAXX_MODE
Should be set to YES when simulating with the EMAXX chipset and to NO otherwise. When set to YES, the auto-optimizer sets up the CDMA_EMAXX=1 environment variable for the user.

INDEX_SECTOR_COUNT
Specifies how frequently the auto-optimizer should retest sectors that have been successful thus far and for which further testing remains. Please refer to the Theory of Operation section to get a better understanding of what this means, but when set to an arbitrarily high value, sector retesting is essentially disabled until all sectors have been tested at least once; when set to a low value (e.g., approaching 1), it will give a strong emphasis to those sectors which it has already tested. The default setting is 5, which gives a strong emphasis to untested sectors, but will at the same time reasonably frequently retest a sector which shows significant potential for further improvement.

LIMIT_BINS
When set to YES, causes any bins that do not have an associated natural server/sector to be ignored. In general, this value should be set to YES whenever a polygon-filtered natural server image is used, and to NO otherwise.

MAX_AZIMUTH_SHIFT
Defines the maximum azimuth (bore angle) shift in degrees that any sector may be adjusted as part of azimuth auto-optimization. A setting of 40 degrees, for example, would mean that a sector initially with an azimuth of 60 degrees must end up somewhere between 20 degrees and 100 degrees (inclusive). The default value is 180 degrees.

MAX_PILOT_POWER
Sets the maximum pilot power (in Watts) to which any sector may be increased. The default value is 6W.

MIN_PILOT_POWER
Sets the minimum pilot power (in Watts) to which any sector may be reduced. The default value is 0.75W.

NONADJUSTABLE_SECTOR_FILE
This optional file complements the ADJUSTABLE_SECTOR_FILE by specifying sectors that may not be adjusted for downtilt trials. This provides an extra degree of flexibility in setting up the auto-optimization data.

NP_CDMA_DIR
Defines the NetPlan AIM temporary directory in which the CDMA simulator template exists. This setting is described in greater detail above in the Setup section.

NP_GLOS_DIR
Defines the NetPlan AIM temporary directory in which the GLOS template exists. This setting is described in greater detail above in the Setup section.

NP_SITE_VERSION
Specifies the site version for all of the sites in the analysis. All sites must have the same version number. This setting defaults to 1.

NP_SYSTEM_NAME
This setting defines the system name in the NetPlan database under which the switch, sites, and sectors exist.

NP_UNLOAD_DIR
Specifies the path for the exported Informix database unloaded data directory. The value must include the unl directory itself. The exported data must represent up-to-date database settings as of the start of the auto-optimizer execution.

OPTIMIZATION_MODE
Specifies the type or types of auto-optimization and the order of the processes. This setting will consist of one or more of the following in a single string:
A    Azimuth auto-optimization
D    Downtilt auto-optimization
P    Pilot power auto-optimization
A setting of D would specify that only downtilt auto-optimization will be performed. A value of DP would specify for downtilt auto-optimization to be performed followed by pilot power auto-optimization. A setting of PD could be used to perform the same auto-optimizations, but in the opposite order.

OUTPUT_LOG_FILE
Specifies the file in which the auto-optimizer output is logged. Generally speaking, the auto-optimizer outputs miscellaneous information to both the log file as well as the Unix standard error file handle (which will normally be output to the terminal).

PILOT_ADJ_FACTOR
Specifies the linear multiplicative step-wise increase/decrease factor for pilot power adjustments. A value of 1.26, for example, is equivalent to specifying that pilot power adjustments be made in steps of +/- 1dB in log scale (10 * log10(1.26), i.e., approximately 1 dB).

PP_ADJ_SECTOR_FILE
This is the equivalent of the ADJUSTABLE_SECTOR_FILE setting, but it applies to pilot power adjustments. Unlike the case of downtilt adjustments, however, sectors must be defined in this file in order to be adjustable (as is the case with AZ_ADJ_SECTOR_FILE).

PRIMARY_REGION
Defines the region to be associated with any adjustable sector that does not specifically have a region defined for it in the adjustable sector file (see the ADJUSTABLE_SECTOR_FILE description). When using a polygon-filtered natural server image, this should be set to match the single region code which is entered in the Calc_Bins configuration file (see CALC_BINS_CONFIG).

PRIORITY_SECTOR_FILE
This optional file specifies priority sectors for downtilt adjustments. The auto-optimizer will complete downtilt trials for all priority sectors before moving on to the remaining adjustable sectors. The priority sector list should be a subset of the adjustable sector list.

RESULT_TABLE_FILE
Defines the file in which the auto-optimizer results data is summarized in tabular form.

RUN_INIT_SIMULATION
This setting determines whether or not the auto-optimizer executes a CDMA simulation to create baseline data. If set to YES, a simulation is run before calculating the bin totals. If set to NO, the auto-optimizer assumes that a simulation has already been run whose results are still valid. This variable defaults to YES, but can be set to NO when valid simulation data already exists, to save a little time.

TIME_LIMIT
The user may optionally specify a time limit expressed in hours. The auto-optimizer will begin to test modifications, and if it is still running when the time limit expires, it will finish the current trial and then end, cleaning up after itself and restoring the original settings (as it normally does upon completion).

USE_NAT_SERVER
Specifies whether or not a natural server image is to be used. This should be set to either YES or NO and will default to NO. If a natural server image is used, the file must exist at NaturalEcIoServer/EcIoServer_1 under the analysis home directory. If set to NO, the Ec/Io best server/sector from each respective trial will be used to map each bin to a server/sector.

The Antenna Information File

The following is an example antenna information file:

#Antenna Definitions File
Antenna Model Definitions:
#AntModel      AntClass     ElecDT
md8cr6xs8-u2   md8cr6xs8    u2
md8cr6xs8-u1   md8cr6xs8    u1
md8cr6xs8-0    md8cr6xs8    0
md8cr6xs8-1    md8cr6xs8    1
md8cr6xs8-2    md8cr6xs8    2
md8cr6xs8-3    md8cr6xs8    3
md8cr6xs8-4    md8cr6xs8    4
md8cr6xs8-5    md8cr6xs8    5
md8cr6xs8-6    md8cr6xs8    6
md8cr6xs8-7    md8cr6xs8    7
md8cr6xs8-8    md8cr6xs8    8
md8cr6xs8-9    md8cr6xs8    9

md8cr6xs8-10   md8cr6xs8    10
md8cr6xs8-11   md8cr6xs8    11
md8cr6xs8-12   md8cr6xs8    12
m8-cr9-800f    m8-cr9-800f

Antenna Class Definitions:
#AntClass     E/M  Gain  HORIZ  VERT  MIN_DT  MAX_DT  DEPTH  PREC
m8-cr9-800f   M    17.0  60     3     u2      6       0.200  1
md8cr6xs8     E    16.0  60     4     u2      12      0.180  1

Models to Avoid:
md8cr6xs8-1 0
md8cr6xs8-3 0
md8cr6xs8-11 0
md8cr6xs8-12 0

This file is divided into three parts, which are identified by the three key phrases "Antenna Model Definitions:", "Antenna Class Definitions:", and "Models to Avoid:".

The individual antenna models are defined in the section of the file denoted by the "Antenna Model Definitions:" key phrase. Each antenna model which exists in the exported NetPlan database must be specifically entered in this section. The first column shall contain the model name verbatim, the second column shall contain the assigned antenna class (re: the "Antenna Class Definitions:" section), and the third column shall contain the downtilt value for electrically downtilted antennas. The third column may be left blank for mechanically downtilted antennas.

It's important to note the distinction between mechanically and electrically downtilted antennas from the auto-optimizer's point of view. For the purposes of this tool, a mechanically downtilted antenna is one for which a downtilt is applied by changing the NetPlan downtilt parameter. An electrically downtilted antenna is one in which the antenna model itself is changed when changing the antenna tilt. For this reason, it is necessary to clearly define the downtilts for electrically downtilted antennas. In addition, all antennas in the same family of electrically downtilted antennas must be defined with the same antenna class. This is how the auto-optimizer links the antennas as a set. Likewise, antenna models that should not be linked must be assigned different antenna classes.

Antenna classes are defined in the "Antenna Class Definitions:" section. It is here that the user specifies whether an antenna is electrically or mechanically downtilted and what the minimum and maximum applicable downtilts are. The precision column specifies the resolution (in degrees) for downtilt changes. A value of 1, for example, indicates that downtilt adjustments are made on a one-degree basis, pending antenna model definitions in the case of electrically downtilted antennas.
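Both the main configuration file and the antenna information file are simple line-oriented text formats. As an illustration of the token = value format described earlier, here is a minimal reader sketch; this is a hypothetical helper, not part of the auto-optimizer itself.

# Minimal sketch of a reader for the "TOKEN = value" format described
# above (a hypothetical helper, not part of the auto-optimizer itself).

def parse_config(path):
    settings = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue                    # comments and blank lines are ignored
            token, _, value = line.partition("=")
            settings[token.strip()] = value.strip()  # null values become ""
    return settings

cfg = parse_config("Auto_Optimizer.cfg")
print(cfg.get("OPTIMIZATION_MODE"))  # e.g., "DP"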

The remaining columns in the antenna class definition section are required as placeholders. The entered information is not used, however, and therefore is not significant. (Users of the NP_to_CSSS tool may recognize the similarity between the antenna info files for the two tools.)

The final section of the file, "Models to Avoid:", is optional, applies to downtilt adjustments, and defines any downtilt settings which should not be tested by the auto-optimizer. This is useful if one knows that the antenna pattern differences between two downtilts are insignificant, or if one wants to reduce antenna inventory by limiting the number of downtilts in use in the system (in the case of electrically downtilted antennas). Even if the precision setting would otherwise specify to test a certain antenna downtilt, it will be skipped if a model or downtilt is entered here.

The Output Files

The auto-optimizer generates the following three types of output files:
- Output log
- Results summary table
- Trial archives

The output log file is specified by the OUTPUT_LOG_FILE configuration file token. This file contains miscellaneous messages from, and information regarding, the auto-optimization process. For a normal execution of the auto-optimizer, the first part of the file contains information related to the initialization process. Next come messages associated with each of the trials, including specific numerical results information. Last come messages associated with the cleanup process, if applicable. The single most valuable use of this file is to confirm that the auto-optimizer both executes and terminates properly. After the auto-optimizer run completes, the user should search for the strings "Warning" and "Error" in the output log. If occurrences are found, the cause will need to be identified, additional cleanup may be necessary, and the auto-optimization run will need to be repeated. Otherwise, if none are found, the user may go on to examine the other results.

Among the various auto-optimizer outputs, the RESULT_TABLE_FILE contains the most useful information in a very concise format. The following is a sample file:

Trial  S-S    Msc  From Antenna       To Antenna         S/F  Row  Change   Comp
1      124-3  UUA  md8cr6xs8-3, 0.0   md8cr6xs8-2, 0.0   S    23   1/91     0
2      137-1  UDA  md8x6u212-7, 0.0   md8x6u212-8, 0.0   U    20   -1/12    0
3      126-5  UUA  md8cr6xs8-8, 0.0   md8cr6xs8-7, 0.0   F    23   -3/-123  1
4      1-5    UDA  md8prx006-3, 0.0   md8prx006-4, 0.0   S    11   2/28     1
5      126-6  UUA  md8cr6xs8-10, 0.0  md8cr6xs8-9, 0.0   F    23   -1/-51   2
6      146-1  UDA  md8cr6xs8-4, 0.0   md8cr6xs8-5, 0.0   S    11   1/104    2
7      126-5  IDA  md8cr6xs8-8, 0.0   md8cr6xs8-9, 0.0   S    23   1/63     1
8      111-6  UUA  m8-cr9-800f, 3.0   m8-cr9-800f, 2.0   F    11   -1/-45   2

The first column shows the trial number. The second column, labeled S-S, contains the site and sector that was tested. The next column contains a three-letter code, which can be decoded according to the following table:

Optimization Mode   First Character   Second Character   Third Character
Azimuth             Untested/Index    Azimuth            Region Code
Downtilt            Untested/Index    Uptilt/Downtilt    Region Code
Pilot Power         Untested/Index    Pilot Power        Region Code

The next two columns in the table show the "from-state" for the sector under test and the "to-state" for the sector. The type of information shown will depend upon the type of optimization, i.e., whether the azimuth, downtilt, or pilot power is being tested.

The sixth column shows the result of the test; this will always be either an S for success, F for failure, or U for unclear. On successes, the change is accepted and the new state and results become the new baseline for ensuing tests. The opposite is true of failures; the result is rejected and the system is reverted to the previous baseline. Unclear results are treated similarly to failures in that the system is reverted to the previous baseline, but there is a difference in that the sector is kept, to be re-tested at a later time.

The seventh column of the results table shows the row in which the decision was made. (This information comes directly from the Image.stats-* files which are logged in the archive directories; row numbers correspond to row numbers in those files.) This is a code for the condition by which the test was judged to be a success or failure, and can be interpreted via the following table:

Row Number   Condition
5            Forward Link Coverage
8            Reverse Link Coverage
11           Coverage (Both Fwd. and Rev. Links)
14           Coverage and Ec/Io >= -10dB
17           Coverage and Ec/Io >= -12dB
20           Coverage and Ec/Io >= -13dB
23           Coverage and Ec/Io >= -14dB

In cases where no performance change is found through the sector change under test, this column will not contain a number, but rather will contain "-".

The eighth column of this file shows a summary of the numerical comparison results. The first of the two numbers shows the change in the decision row (of the previous column) at the decision power (please refer to the Theory of Operation section for more detail). A positive number indicates that the change results in improved performance for the condition indicated by the previous column (corresponding to the power level, which can be found in the output log file). A negative number likewise indicates degradation. The second number in the pair represents the overall change, and is a tally of each of the individual conditions' changes for each of the power levels under consideration. Again, a positive number indicates an overall improvement, and a negative number represents overall performance degradation.

The final column of this file shows a count of the number of sector-directions that have been tested and completed. This is an approximation, as sectors that are once considered to be completed may need to be re-tested due to a chain-reaction effect. In the case of azimuth optimization, pilot power optimization, or downtilt optimization in which both uptilts and downtilts are to be performed, the completed sector count will go up to twice the number of sectors under test. For downtilt optimization in which only one of either uptilts or downtilts is performed, this count will come to equal the number of sectors under test.

Various files are stored in each of the trial sub-directories within the archive directory. Many of these files are direct outputs of the Calc_Bins utility (such as the Image.stats-* files mentioned above), but the most useful file, and the only one to be described here, is the output of the auto-optimizer. Only for trials which were successes, there will be a change recommendations file, called either AZ_Recommendations, DT_Recommendations, or PP_Recommendations as appropriate, which contains a listing of all change recommendations for all testing (for the type of optimization currently under consideration) up until then. Therefore, when the auto-optimization run completes, the results table file should be checked to see what the last successful trial was for each optimization type, and then the archived trial directory or directories should be consulted to see the list of recommended changes.

Stopping Execution Midway Through an Auto-Optimization Run

There may be occasions in which the auto-optimizer must be stopped before it completes its operation naturally. Since the tool works directly on the NetPlan XLOS files in the case of azimuth and downtilt optimization, and on the CDMA simulator template file in the case of pilot power optimization, the auto-optimizer must clean up after itself to restore all critical files to their original states before terminating operation. It is therefore extremely dangerous to terminate the auto-optimizer prematurely with a control-C. Rather, there is a mechanism built into the auto-optimization tool for this very purpose. After each test trial, the program pauses for 20 seconds, during which time the program waits to receive a QUIT signal. If it does not receive this signal, the auto-optimization continues. If a signal is received, the auto-optimizer recognizes that it needs to terminate the run and begins to clean up after itself. The signal is sent to the process with the following command:

kill -3 PID

from a different shell, where PID should be replaced with the auto-optimizer's process ID (which can be determined with the help of the ps command).

47

Optimization Procedures and Guidelines, Ver 1.2 Network Design Verification/Review optimization the cleanup process will be very quick, but for azimuth and downtilt optimization runs, the cleanup process may take some time, depending on the number of changes accepted thus far in the run.
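The same QUIT signal can also be sent programmatically. A minimal Python sketch, assuming the process ID (12345 here is purely hypothetical) has already been read from the ps listing:

    import os
    import signal

    pid = 12345  # hypothetical auto-optimizer PID, as reported by ps
    # SIGQUIT is signal 3, i.e., equivalent to "kill -3 PID" from a shell;
    # the auto-optimizer polls for it during its 20-second pause between trials
    os.kill(pid, signal.SIGQUIT)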

Theory of Operation

Whether the optimization mode be azimuth, downtilt, or pilot power, the auto-optimizer tests and changes a single sector's setting one notch at a time. The results of each test are compared to the current baseline, and the change is accepted if the results are judged to improve. In such a case, the baseline is revised appropriately to reflect the new information. If the test is deemed a failure, the setting is reverted to the previous setting and the previous baseline is maintained. Whenever a test results in a failure, that change direction is marked internally as being completed. Furthermore, if a test is successful, the opposite direction is marked as being completed. Any sector direction which is marked as being completed will not be re-tested, unless it should become unmarked. Any marked sector will be unmarked if a test for a sector is successful and the resulting bin counts show that the conditions for the sector have changed. This is determined by counting the number of bins associated with the marked sector as their best server at maximum power and within 3 dB of that maximum power (also noting that maximum power level). Should any of these values change, the sector will be unmarked, thereby enabling it for further testing.

As the auto-optimizer runs, it keeps track of those sectors that have been tested already and those which have yet to be tried. For those that have been tested, it maintains an index value which is a function of the results associated with that sector's last trial. More specifically, an index is maintained for each change direction. For example, in the case of downtilt optimization, a separate index will eventually exist for each sector for both uptilts and downtilts. The greater the value, the more promise for further changes in that direction for that sector. In addition, when a sector is first tested in either direction, the index is entered not only for that direction, but the negative of the index is entered for the opposite change direction. Therefore, if a test yielded excellent results, this is reflected in the index for that change direction for that sector, so as to indicate that further change in that direction has strong potential to yield further performance improvement. Likewise, the index for the opposite direction will be assigned a highly negative value to indicate that there is little potential for such a test.

The auto-optimizer will tend to concentrate on untested sectors when it first starts out, though this behavior can be modified with the INDEX_SECTOR_COUNT configuration parameter. The auto-optimizer will begin by testing untested sectors. After INDEX_SECTOR_COUNT instances of either successes or outright failures on previously untested sectors, it will then test a previously tested sector, if one exists. It chooses which sector to test, and in which direction, based upon the index value, with the sector direction with the highest index value being selected. This pattern will continue until no untested sectors remain, after which testing will continue based only upon the index table. Testing will then continue until no uncompleted sector directions remain.

Trials are accepted as successful whenever the two results values that are logged in the results table file are both positive, or if the first is positive and the second is zero (the opposite case is not possible). A trial results in failure if both of those values are negative, or if the first is negative and the second is zero. If one value is positive and the other negative, the result will be unclear (similar to a failure, except the sector is not marked as being completed) if there are untested sectors remaining, or a failure if no untested sectors remain. In this sense, the auto-optimizer is more forgiving early on in its operation, but then gets stricter as it approaches its completion. From this, it can also be seen that the auto-optimizer places emphasis on improving areas with weak coverage: it will only accept changes in which overall performance is improved while the amount of weakly covered area is likewise reduced.

The testing order for untested sectors and sector directions is initially selected based upon the maximum mobile transmit powers associated with the natural server sectors. For uptilt and azimuth optimization, the as-of-yet untested sector with the greatest maximum mobile transmit power is selected. For downtilt and pilot power optimization, the untested sector with the lowest maximum mobile transmit power is selected. In this manner, downtilt and pilot power optimization will tend to move outwards from urban areas to rural areas, and uptilt and azimuth optimization will tend to proceed in the opposite direction.

Algorithmically speaking, because testing for any sector will end as soon as a failure is encountered, and because a change direction will not be tested if a change in the opposite direction was already found to result in improvement, the auto-optimizer may not find ideal settings if it becomes trapped at a local maximum. The extent to which such local maxima exist is unclear. In any case, because of this aspect of its operation, the effectiveness of the tool may to some degree be a function of the set of starting parameters. It is hoped that re-testing sectors when necessary will serve to minimize this effect.

As a final note, the importance of applying a polygon filter to the natural server image, and of using other techniques to improve execution speed, cannot be overestimated. In particular, the use of a polygon filter will not only speed up the post-processing of the simulation data, it will also significantly increase the accuracy of the results. Any area which is not to be covered by the sites in the simulation (e.g., mountainsides, excessive distances, oceans, etc.) should be filtered out so that the auto-optimizer can concentrate on maximizing performance in those areas which the system is meant to cover.
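The trial accept/reject rule described above is simple enough to state in code. A minimal Python sketch of that rule only (the function name and argument names are illustrative, not taken from the tool itself):

    def classify_trial(decision_delta, overall_delta, untested_remaining):
        """Judge one trial from the two values logged in the results table."""
        if decision_delta > 0 and overall_delta >= 0:
            return "success"   # both positive, or first positive and second zero
        if decision_delta < 0 and overall_delta <= 0:
            return "failure"   # both negative, or first negative and second zero
        # mixed signs: forgiving while untested sectors remain, strict afterwards
        return "unclear" if untested_remaining else "failure"

A success revises the baseline and marks the opposite change direction as completed; a failure reverts the setting and marks the tested direction as completed.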


APPENDIX 2B: Sample Problem Resolution Matrix (PRM)

This PRM shows various examples of problems encountered in the field. Each entry records the problem identifier, date, sectors and physical location involved, performance readings (Rx, Ec/Io, FFER, Tx), a problem description (#FTOs, drops, etc.), the status and action plan, and the closed date.

Legend (good = +, marginal = *, poor = -):
   Rx:     good > -80 dBm;   marginal -90 dBm < Rx < -80 dBm;    poor < -90 dBm
   Ec/Io:  good > -12 dB;    marginal -15 dB < Ec/Io < -12 dB;   poor < -15 dB
   FFER:   good < 3%;        marginal 3% < FFER < 5%;            poor > 5%
   Tx:     good < +17 dBm;   poor > +17 dBm
   RFER:   good < 3%;        marginal 3% < RFER < 5%;            poor > 5%

Problem B1 (3/13/98)
   Sectors: 2-4; 53-1; 32-3. Physical: South BTS 2; North BTS 53; East BTS 32.
   Performance: Rx *, Ec/Io *, FFER +.
   Problem: FTOs and drops; marginal Ec/Io and Rx; no dominant server in area.
   Action Plan: Redrive area after uptilt done on 32-3 and 2-4; probably will create set of dominant servers.
   Status: open.

Problem B1 (3/13/98)
   Sectors: 42-4; 33-1; 33-2; 32-5; 32-6. Physical: South BTS 42; Northeast BTS 33; West BTS 32.
   Problem: FTOs and drops; poor readings for Ec/Io, Fwd. FER, and Tx power; may be due to BTS 74 off air.
   Action Plan: Redrive after fixing BBXs on BTS 74 to determine if still problems in this area. Most of these pilots are weak, and BTS 74 should be the dominant server when working.
   Status: open.

Problem B1 & B2 (3/13/98)
   Sectors: all BTS 77 sectors. Physical: border between B1 & B2; specifically North BTS 77; East BTS 46; Southwest BTS 70.
   Problem: BTS 77 off air due to CSMs.
   Action Plan: Redrive after fixing CSMs on BTS 77 to determine if still problems in this area. Area should see better Ec/Io when it can use BTS 77 for soft(er) handoffs.
   Status: open.

Problem B1 & B3 (3/13/98)
   Sectors: 33-4; 33-5; 74-all; 69-all; 29-1; 29-2; 29-6; 30-3; 30-4; 30-5. Physical: "Ring of Fire" cluster.
   Problem: Multiple pilots serving area.
   Action Plan: Increase downtilts on 29-2 and 29-6, which are overshooting into this area. Reduce SIF pilot powers on 30-3, 30-4, 30-5 (already at 9 degrees downtilt).
   Status: open.

Problem B3 (3/13/98)
   Sectors: all BTS 61 sectors.
   Problem: Coverage boundary cell going into mountains; coverage spotty.
   Action Plan: Recommend repointing antennas along major roads to minimize impact to users.
   Status: open.
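The legend thresholds above map directly onto a simple rating function for post-processing drive data. A minimal Python sketch (threshold values taken from the legend; function names are illustrative):

    def rate_rx(rx_dbm):
        """Rate mobile receive power: + good, * marginal, - poor."""
        if rx_dbm > -80:
            return "+"
        return "*" if rx_dbm > -90 else "-"

    def rate_ecio(ecio_db):
        """Rate pilot Ec/Io in dB."""
        if ecio_db > -12:
            return "+"
        return "*" if ecio_db > -15 else "-"

    def rate_fer(fer_pct):
        """Rate forward or reverse frame erasure rate in percent."""
        if fer_pct < 3:
            return "+"
        return "*" if fer_pct < 5 else "-"

Rating drive data this way makes it easy to fill in the Rx, Ec/Io, FFER, and RFER columns of the PRM consistently from one engineer to the next.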


3.0 Equipment Installation and Test


3.1 Description

This activity encompasses verification that the installation and test of new CDMA BTS equipment at site locations was properly performed. By working with the CFE (Cellular Field Engineer) crews to evaluate the BTS ATP (Acceptance Test Procedure) data, the RF system optimization team will ensure the cell sites are ready to start Single Cell Functional Test (Chapter 7). The lead System Engineer should be aware of any site-specific issues that may delay the availability of a site.

The BTS Optimization/ATP process encompasses the tasks of installing the BTS software, performing the cell site ATP, optimizing per the ATP, and verifying the functionality of the site to be sure the cell is properly operating (e.g., making calls on all MCC channel elements). The BTS Optimization/ATP is a sanity check that verifies the hardware is doing what it is advertised to do.

Going beyond the BTS Optimization/ATP procedure, it is critical that the cell site be fully integrated across the span lines to the CBSC. Several markets have developed Integration Test Checklists to follow to ensure that this work is performed properly. Appendices to this chapter contain a sample checklist that can be used to verify that this BTS/CBSC integration activity is complete. Naturally, modifications to this checklist are appropriate for various equipment configurations, and should be implemented as necessary.

[Note: In practice, this type of formal review is not typically conducted. Usually, notification is provided to the system optimization teams from the CFE teams that a cluster is ready for optimization. However, this chapter gives the RF optimization engineer the necessary guidelines and information to work with the CFEs to isolate any problems that may have been overlooked. As an alternative, the CFE manager could do this quality review and sign off on the turnover of BTSs to the system engineering team prior to start of SCFT.]


[Figure 3.1-1 is a flow diagram. Optimization Preparation (Network Design Verification, Chapter 2; Equipment Installation and Test Verification, Chapter 3; RF Parameters Database Verification, Chapter 4; Spectrum Clearing, Chapter 5; Data Collection and Analysis Tools Selection, Install, Test, Chapter 6), with accurate terrain, clutter, and XLOS tuning data feeding the system design via NetPlan/CSSS, leads into Network Optimization (Single Cell Functional Test, Chapter 7; Initial Coverage Test, Chapter 8; System Optimization and Detailed Problem Resolution, Chapter 9; Final Coverage Survey and Warranty Testing, Chapter 10), followed by System Operations and Commercial Service: Network Performance Monitoring and Expansion (Chapter 11).]

Figure 3.1-1: Relationship of Equipment Installation Verification to Entire Optimization Process

3.2 Tools Required

No specific tools are required for this task. However, to enable this verification, the system engineer should understand the BTS Optimization/ATP procedure and the BTS/CBSC integration procedures, and be capable of reviewing these data sheets with the CFE. BTS Optimization/ATP procedure manuals for various BTS models can be ordered from TED at http://www.cig.mot.com/TED/docs.html. These manuals list additional documents and tools required to perform optimization of the cell site equipment. [An LMF, or Local Maintenance Facility, will be used to conduct many of the ATP tests. LMF basics are covered in the "LMF Operator's Guide, Local Maintenance Facility", manual # 68P09226A13-B, which can be ordered from TED at http://www.cig.mot.com/TED/docs.html.]

3.3 Personnel Required

Type                              Skill Level
RF System Engineer                White Belt (See Appendix A)
Cellular Field Engineer (CFE)     See Appendix A

Table 3.3-1: Personnel Required

3.4 Entrance Criteria

1. The system engineer should verify that the BTS Optimization/ATP has been completed for each BTS being reviewed, and that the datasheets/ATP reports from each BTS are available. This can be done for an entire cluster at a time for maximum efficiency. The system engineer can review the data sheets even if they have a master sign-off by the CFE manager.

2. Any additional data sheets from other test activities, such as Integration Test Procedures (ITPs), are available. ITPs traditionally cover tests above and beyond the normal Opto/ATP procedure and are directed at ensuring proper integration between the BTS and CBSC. ITPs may include special considerations for more complex BTS equipment configurations, such as double-density cages supporting multi-carrier operation where additional cabling requirements are present. A sample data sheet for a set of ITP tests conducted for deployment of a 2nd carrier in Los Angeles is contained in Appendix 3A.

3.5 Procedure

The system engineer should coordinate with the CFE to evaluate the Optimization/ATP data sheets from each BTS to learn the following for all sites in a cluster under evaluation:

1. Check the last date of the BTS Optimization/ATP procedure (to ensure that the latest procedure was used).

2. BTS ATP datasheets should be reviewed to make sure all test results were within acceptable limits for each BTS and module. Particular attention should be paid to whether the Tx Cal and the Rx FER tests passed. (These tell the systems engineer that the transmit path and modules, and the receive path and modules, are working properly.)

3. Determine whether any modules were replaced, or if any ATP tests failed initially and then subsequently passed.

4. Verify all cables were calibrated correctly.

5. Ensure that the latest cal file was backed up onto the OMC-R at the MTSO. This will also tell the system engineer when the site was last calibrated.

6. Integration Test Procedure (ITP) data sheets should be reviewed to ensure that all tests have passed.

3.6 Analysis Conducted

If any problems were identified while conducting the procedure above, this information should be entered into the Problem Resolution Matrix (PRM) for this cluster to capture this historical information for future reference. Any repeat failures at a particular BTS, during either ATP or ITP, should be highlighted.

One of the most common equipment installation problems found is with antenna installation. The problems may include incorrect antenna type, orientation, or tilt, faulty cabling, and improper antenna mounting. Reasons for incorrect installation may include, but are not limited to, subcontracted work and overworked or untrained antenna crews. To eliminate or reduce such occurrences, each market should have specific guidelines to follow. The engineer responsible for the optimization team should conduct random site visits to spot-check installations. All problems encountered should be entered into the PRM so they can be tracked and a CFE dispatched to the site if necessary. The engineer should also coordinate with antenna crews to ensure a timely installation schedule.

3.7 Exit Criteria

The system engineer has verified the following for each BTS in the cluster (and first-tier surrounding sites at a minimum):

1. All BTS Optimization/ATP tests have been completed.
2. All the calibrations are correct.
3. The calibration files are backed up to floppy and the OMC-R.
4. All modules are in service.
5. The ITP has been successfully completed.
6. The site is on the air and working properly.


Appendix 3A: ITP Checklists


ITP Test Plan

Site Name: ________________________________   Site Number: _________________   Site Model: _________________
New [ ]   Upgrade [ ]   Expand [ ]   Other [ ]

All tests and documentation are specific only to the 2nd Carrier.

Section 1: Verification and Hardware Configuration

                                                                Complete   Incomplete   NA
A. Fill in attached card audit                                    [ ]         [ ]       [ ]
B. Verify and/or diagram cabling per customer documentation       [ ]         [ ]       [ ]
C. Verify span line has been installed and GLIs can
   communicate with the CBSC                                      [ ]         [ ]       [ ]

D. Verify antenna tilts and azimuths (from NetPlan simulation):
   Sector __    Pass [ ]   Fail [ ]   NA [ ]   (one row per sector, six sectors)

Section 2: Pilot PN Offset Verification
(Can be verified with a DM. Some phones may be put into test mode and used to verify the PN.)

   PN Offset:
   Sector __  __________    Pass [ ]   Fail [ ]   NA [ ]   (one row per sector, six sectors)

Section 3: Call Completion and MCC Testing
Place calls to and from the MTSO. (Tests may be combined with the Diversity tests.)

   Mobile-to-Land (each sector):     Sector __   Pass [ ]   Fail [ ]   NA [ ]   (six sectors)
   Land-to-Mobile (each sector):     Sector __   Pass [ ]   Fail [ ]   NA [ ]   (six sectors)
   Mobile-to-Mobile (each sector):   Sector __   Pass [ ]   Fail [ ]   NA [ ]   (six sectors)

Diversity: Mobile-to-land call was successfully completed...
   with Diversity RX removed, Sector __    Pass [ ]   Fail [ ]   NA [ ]   (six sectors)
   with Primary RX removed, Sector __      Pass [ ]   Fail [ ]   NA [ ]   (six sectors)

Channel Elements: Place calls on all remaining channel elements.
   Number of MCCs: _______   Number of Sectors: _____
   MCC __    [ ] [ ] [ ] [ ] [ ] [ ] [ ] [ ]   (one row of checkboxes per MCC)
                                                  Complete [ ]   Incomplete [ ]   NA [ ]

Section 4: Redundancy (tests performed with an active call up)

GLIs: First verify GLI __ is INS and GLI __ is INS/STBY.          Complete [ ]   Incomplete [ ]   NA [ ]
A. Push the reset button on GLI __, forcing GLI __ INS and GLI __ OOS; verify with the CBSC.
B. Load and enable GLI __; verify GLI __ INS/STBY.
C. Push the reset button on GLI __, forcing GLI __ INS and GLI __ OOS; verify with the CBSC.
D. Load and enable GLI __; verify GLI __ INS/STBY.
E. Verify GLI __ is INS and GLI __ is INS/STBY.

BBXs: First verify the active BBXs are INS and BBX __ is INS/STBY.
A. Force the active BBX OOS and BBX __ INS by pulling the active BBX out of its slot.
B. Re-insert the BBX; reload and enable the BBX.
C. Verify the BBX is INS and BBX __ returns to INS/STBY.
D. Repeat steps A, B, and C for each sector:
   Sector __    Pass [ ]   Fail [ ]   NA [ ]   (six sectors)

Secondary Timing Source Takeover                                  Pass [ ]   Fail [ ]   NA [ ]
A. Connect the LMF to the MMI port on the CSM; verify the GPS and LFR timing sources.
B. Disconnect the GPS antenna from the top of the bay.
C. Verify the LFR or HSO takes over and that voice quality remains the same on the call.
D. Reconnect the GPS antenna to the top of the bay.

Power Supplies (this test can also cover Alarms; 2nd Carrier only)   Pass [ ]   Fail [ ]   NA [ ]
A. Pull Power Supply __; verify Power Supply __ holds up the shelf and the call remains the same.
B. Verify the CBSC receives an Alarm message.
C. Re-insert Power Supply __ and verify the CBSC Alarm clears.
D. Pull Power Supply __; verify Power Supply __ holds up the shelf and the call remains the same.
E. Verify the CBSC receives an Alarm message.
F. Re-insert Power Supply __ and verify the CBSC Alarm clears.

Section 5: Alarms

SLPA modules (__ sites; 2nd Carrier only)                         Pass [ ]   Fail [ ]   NA [ ]
A. Pull the SLPA modules one at a time; verify the CBSC receives an Alarm message.
B. Re-insert the SLPA modules and verify the CBSC Alarm clears.

CSM & BDC Redundancy                                              Pass [ ]   Fail [ ]   NA [ ]
A. Verify BDC __ and CSM __ are inserted in slots.
B. Have the CBSC bring BOTH carriers down. Load CSM __ and BDC __. (Disabling BDC __ will be enough.)
C. Enable CSM __ and BDC __, then enable BDC __.
D. Have the CBSC enable the MCCs and BBXs.
E. Repeat steps A, B, C, and D to return to CSM/BDC __.
F. Pull out CSM __, reseat the LFR, then re-insert CSM __. (This will re-enable LFR tracking.)
   Sector __    Pass [ ]   Fail [ ]   NA [ ]   (six sectors)

MCC Alarms (2nd Carrier only)                                     Pass [ ]   Fail [ ]   NA [ ]
A. Pull each MCC one at a time; verify the CBSC receives an Alarm message for each MCC.
B. Re-insert each MCC and verify with the CBSC that the Alarms clear.

Section 6: Handoff Tests
A. Softer handoff from sector to sector:
   Sector __ to Sector __    Pass [ ]   Fail [ ]   NA [ ]   (one row per sector pair, twelve pairs)

Antenna Diagram Sheet

Site Name: ________________________________   Site Number: _________________   Site Model: _________________
New [ ]   Expand [ ]   Upgrade [ ]   Other [ ]

1. Draw in all connections and devices between the BTS and the waveguide entry port.
2. Verify that all transmit and receive RF cables are connected to the correct antennas to which they were assigned. (Note: cables may be different than those shown.)

BTS PORTS:      DT-A  DT-A  DR-A  DR-A  DT-B  DT-B  DR-B  DR-B  DT-G  DT-G  DR-G  DR-G
ANTENNA PORTS:  DT-A  DT-A  DR-A  DR-A  DT-B  DT-B  DR-B  DR-B  DT-G  DT-G  DR-G  DR-G

ITP Inventory

Site Name: ______________________________________   Site Number: ________________________   Site Model: __________________________
New [ ]   Expand [ ]   Upgrade [ ]   Other [ ]

Card Name           Part Number     ATC Serial #     Card Status
CSM # 2
GLI # 3
GLI # 4
BDC # 3
BDC # 4
BBX # 5
BBX # 6
BBX # 7
BBX # 8
MCC # 21
MCC # 22
MCC # 23
MCC # 24
MCC # 25
MCC # 26
MCC # 27
MCC # 28
Power Supply # 3
Power Supply # 4
LPA # 7
LPA # 8
LPA # 9
LPA # 10
LPA # 11
LPA # 12

EXTRA CARDS         Part Number     ATC Serial #     LOCATION

Cellular Field Engineer Signature: ___________________________________   Date: ___________
Network Field Manager Signature: ____________________________________   Date: ___________


4.0 Database Verification


4.1 Description

Figure 4.1-1 shows the relationship of this database verification activity to others within the network optimization flow. Database verification is the examination of key RF-related parameters. These parameters are defined by either the system design (e.g., SIF power settings, neighbor lists) or network and equipment deployment considerations (e.g., ICBSC border placement). Parameter settings should comply with the recommended, release-specific default parameters, which are available on-line. (See Section 4.2 for information on how to access these spreadsheets.)

Since the network optimization engineer may not be the same person who generates the system databases or participates in the system design review (Chapter 2), this is an opportunity to become more familiar with the network design intentions and verify that the system's database is appropriately configured for system optimization. After the various database input tables representing RF parameter settings and neighbor lists have been created and the MIB has been generated using database tools (such as the Condor CM tool), the information should be extracted from the system database and compared to the original design intentions to verify its accuracy. In addition, the transcoder database should be reviewed.

Three basic sets of parameters are investigated in this activity: RF parameters, neighbor lists and supporting tables, and transcoder (XC) parameters.


[Figure 4.1-1 is a flow diagram. Optimization Preparation (Network Design Verification, Chapter 2; Equipment Installation and Test Verification, Chapter 3; RF Parameters Database Verification, Chapter 4; Spectrum Clearing, Chapter 5; Data Collection and Analysis Tools Selection, Install, Test, Chapter 6), with accurate terrain, clutter, and XLOS tuning data feeding the system design via NetPlan/CSSS, leads into Network Optimization (Single Cell Functional Test, Chapter 7; Initial Coverage Test, Chapter 8; System Optimization and Detailed Problem Resolution, Chapter 9; Final Coverage Survey and Warranty Testing, Chapter 10), followed by System Operations and Commercial Service: Network Performance Monitoring and Expansion (Chapter 11).]

Figure 4.1-1: Relationship of Database Verification Activity to Entire Optimization Process


4.2 Tools Required

There are a variety of tools available to inspect and evaluate the system database settings, as listed in Table 4.2-1.

show_allparms: A script that compares the installed MIB to the recommended default RF parameter settings and generates a report of the differences. To obtain the script, click on the scripts button at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html and choose the hyperlink for the correct version of the "Show Parameters" script by Dennis Helm to download. This script also has the capability to report differences between the installed databases examined on different days.

get_mib: Script that extracts the MIB and creates an output file; used by Xfreq. This script can be found at http://www.rochellepark.pamd.cig.mot.com/software.html; click on the xf-nbr.tar.Z link to download.

xtract7_nlist: Script that extracts the neighbor list from the MIB; used by Xfreq. This script is packaged in the xf-nbr.tar.Z file and can be downloaded from the above link.

Xfreq: Script that graphically displays neighbor lists and various parameter settings. This script can be found at http://www.rochellepark.pamd.cig.mot.com/software.html.

Compas_NL: The Compas_NL tool will generate a file that can be read into NetPlan. The neighbor list can then be displayed graphically. For more information and to obtain this script, the URL is http://www.cig.nml.mot.com/~spresney/Compas_NL/Compas_NL.html.

Falcon: Falcon is a Java-based database visualization tool that provides insight into the contents of the MIB. It can run on a variety of platforms. More information and a demo version of the tool can be found at http://www.pamd.cig.mot.com/~toolprod/falcon.

Parameter Spreadsheets [Matt Dillon]: Spreadsheets containing the recommended default parameter settings for each software release; URL: http://www.cig.mot.com/cdma_ase/index.html.

System Commands Reference: Reference to all system commands, their syntax, and sample outputs. This manual can be downloaded from http://www.cig.mot.com/TED/docs.html: click on the hyperlink "Online Product Documentation", choose the Supercell button, then the SC Product Family-CDMA button. Click on the hyperlink "OMCR/CBSC/SYSTEM" and then scroll down to "System Commands Reference".

Table 4.2-1: Database Verification Tools and References


4.3 Personnel Required

Type                Skill Level
Systems Engineer    White Belt (See Appendix A)

Table 4.3-1: Personnel Required

4.4 Entrance Criteria

1. The RF parameters, Sectop, XASect, and XCSect tables are completed and are loaded into the system MIB. The XC database is installed as well.
2. Review the read_me files for the current versions of the scripts and tools which will be used, for any new information.
3. A copy of all network design outputs for the system is available to use as a baseline for database checks.
4. A copy of the parameters spreadsheet for the system release being deployed has been downloaded and printed for reference.
5. A copy of the ITP data sheets is available from the equipment installation verification activity (Chapter 3).

4.5 Procedure

4.5.1 MIB Parameter Audit

Appendix 4A contains reference information on how to use the show_allparms script to collect the information required for this exercise.

4.5.1.1 System Wide Parameters

Install and run the show_allparms script and view the output files highlighting differences between the MIB and the recommended default RF parameters. Any discrepancies reported by show_allparms should be investigated to determine why they exist. Corrections should be made to the database as appropriate. Consultation with the system design team or customer may be required. The show_allparms script will enable a quick check of the parameters listed below:

Call Processing: BusyidleUp, MmCpT3, MmCpT10, MMCpT11, MMCpT14, ToMMFep, PingMMfep, FepBundleflag, MM Bundling, FepMxWaitBundle.

4.5.1.2 BTS and Sector Parameters

Using the output report from the show_allparms script, evaluate any discrepancies reported between the existing database and the recommended RF default parameters. Identify the reason for any discrepancies to determine if there is good reason to diverge from the recommended default values. Consultation with the system design team or customer may be required. The following categories of parameters are evaluated by show_allparms:

MAHO: T-ADD, T-DROP, T-COMP, TTDrop.

MM N-Way: MaxActSetSz, MaxCEPPerCall, MaxBTSLegs1, MaxBTSLegs2, MaxBTSLegs3, Softer Shuffle Comp, Soft Shuffle Comp, BTS Shuffle Comp, Enable Softer Shuffle, Enable BTS Shuffle, Enable Soft Shuffle, Num Candidate, SendHopermMess, ComplexShoENA.

XC N-Way: Aggr Active Set 1BTS, Aggr Active Set 2BTS, Aggr Active Set 3BTS, XCTComp, TcompEnaTrsh.

Neighbor Lists: NeighAssoc, SrchWinA, SrchWinN, SrchWinR, NghbrMaxAge.

Reverse Power Control: RPCMaxEbNo, RPCMinEbNo, RPCNomEbNo, RPCUpPFull, RPCUpPNFull, RPCDownP.

Forward Power Control (Mobile): PwrRepThresh, PwrRepDelay, PwrRepFrames, PWRThreshEna, PwrPeriodEna.

Mobile Initial Power: NomPwr, InitPwr, PwrSet.

Forward Power Control CBSC/BTS: FwdPwrThresh, OrigDely, DeltaTime, StepDownDelay, OrigGain, FER_target.

PPS Powers: PilotGain, PchGain, SchGain, SifPilotPwr.

Forward Traffic Channel Gain: MaxGain1Way, NomGain1Way, MinGain1Way, MaxGain2Way, NomGain2Way, MinGain2Way, MaxGain3Way, NomGain3Way, MinGain3Way, MinPcbGainFact, StepUp, StepDown.

Access Channel: AccTmo, AchPaml, AchPamEbNo, AchPamWinSz, CellRadius, NumStep, MaxCapSz, PamSz, Psist0to9, Psist09, Psist10, Psist11, Psist12, Psist13, Psist14, Psist15, MsgPsist, RegPsist.

Reverse Traffic Channel: TchPamEbNo, TchPamWinSz, TchPamlper, TchAcqWinSz, TchAcqEbNo, TchAcqlPer, TchMpthWinSz, TchMpthebNo, TchMpthlper, MccCpT1.
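The comparison that show_allparms performs can be pictured with a few lines of code. A minimal Python sketch of a parameter diff (the dictionaries, keys, and values here are hypothetical, not extracted from a real MIB):

    # installed values extracted from the MIB, keyed by (bts, sector, carrier)
    installed = {("2", "2", "1"): {"TTDROP": 2, "TADD": -14}}
    recommended = {"TTDROP": 3, "TADD": -14}  # from the release parameter spreadsheet

    for key, params in installed.items():
        for name, value in params.items():
            if value != recommended.get(name):
                # same shape as a diff_rec.out line: BTS-SEC PARAMETER=value (recommended)
                print("%s %s=%s (%s)" % ("-".join(key), name, value, recommended[name]))

Each reported difference is a discussion item, not automatically an error; the point of the audit is to confirm that every divergence from the defaults is intentional.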

4.5.1.3 SIF Pilot Powers and Antenna Tilt/Azimuth

SIF Pilot Powers

Using the outputs from the network design activity (Chapter 2), evaluate the installed value of each sector's SIF pilot power to ensure that the value matches the recommendation provided by the system design activity. The show_allparms script by Dennis Helm also shows the SIF pilot powers by CBSC. As an alternative, to obtain the SIF pilot powers on a per BTS/sector basis from the MIB, log onto the OMC-R, open a CLI window, and type the command "disp carrier-bts#-sector#-carrier# ppsgain", where bts#, sector#, and carrier# are to be filled in with the correct BTS, sector, and carrier numbers. To learn more about this command, see the System Commands Reference, Volume 3, Chapter 11.

Antenna Tilt and Azimuths

At this time it is convenient to use the ITP data sheets collected during the equipment installation verification activity (Chapter 3) to verify that the installed antenna tilts match the system design tilts. Proceed through each site and sector and confirm that the installed tilt angle on each sector is identical to the desired simulation activity output. Note any discrepancies for future reference.

4.5.1.4 Neighbor List Check

4.5.1.4.1 Neighbor List (Sectop) Check

Three tools are available for reviewing the network neighbor list:
- Compas_NL
- Xfreq (in conjunction with xtract7_nlist and get_mib)
- Falcon

The intent at this starting point of the network optimization cycle is to ensure that the installed neighbor lists meet the intent of the system design and make sense from a practical perspective (do they pass the common-sense test). The tools listed above can be used to graphically display the network neighbor lists. Select one of the tools to facilitate this audit, and proceed using the following basic checklist to prioritize the initial neighbor lists:

- Adjacent sectors at the same site should be included in each other's neighbor lists, and be positioned at the top of the list. For a six-sector system, sectors on the other side of the site should be maintained, in general, in the top 11 entries in the neighbor list.
- Sectors pointing towards each other should be in each other's neighbor lists.
- Sectors pointing into the same coverage areas should be in each other's neighbor lists. These should be prioritized based upon the amount of coverage overlap.

Special cases may include:

- Sectors facing in the same direction (azimuth) where one sector overshoots another site: If there is desired coverage overlap, then the sectors should be neighbored. (E.g., this may be typical of sites that are at the base of a mountain and are only one tier away from each other.)
- Sectors separated by terrain obstacles: For example, if there are two clusters, but they are separated by a mountain range and could not enter into soft handoff with each other because their coverage footprints do not overlap, there is no need to put them in each other's neighbor lists. Make use of the elevation data in NetPlan, using the Profile function, to confirm any terrain obstacles, or use the Best Server Ec/Io image to determine whether coverage footprints overlap.
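One of the additional audit checks listed next, neighbor-list reciprocity, lends itself to a quick script once the neighbor lists have been extracted from the MIB (e.g., with xtract7_nlist). A minimal Python sketch, using a hypothetical dict of neighbor lists keyed by sector name:

    # hypothetical extract: sector -> ordered neighbor list
    neighbors = {
        "2-1": ["2-2", "2-3", "7-1"],
        "2-2": ["2-1", "2-3"],
        "7-1": ["7-2"],          # missing the reciprocal entry for 2-1
    }

    for sector, nlist in neighbors.items():
        for nbr in nlist:
            if sector not in neighbors.get(nbr, []):
                print("non-reciprocal: %s lists %s, but not vice versa" % (sector, nbr))

Each flagged pair should then be corrected, or justified against the system design.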

Additional checks for neighbor list development are listed here:
- Verify that reciprocal neighbors are entered into each other's neighbor lists (a scripted version of this check is sketched above).
- Verify that neighbors for a specific sector are within the cell radius limits (eliminate distant neighbors).

4.5.1.4.2 Neighbor List Support Tables

In addition to the Sectop tables, there are related tables whose configuration must be consistent with the neighbor lists to facilitate various types of handoffs. These handoffs include D-to-A, D-to-D (inter-CBSC soft handoff and CBSC anchor handoffs), multi-carrier handoffs, and inter-EMX handoffs. The discussion of each type of handoff is beyond the scope of this chapter; however, high-level guidance, including specific references and tools that may be used, is provided below to facilitate the investigation and verification of these database entries. More specific guidelines are planned for future releases of this document.

XASect and XCSect Verification

The Falcon tool is intended to graphically display various portions of the database and do sanity checks on linkages contained within the MIB. This should enable consistency checking of the XASect and XCSect tables for systems that are configured to use these designations.

Inter-CBSC Soft and Anchor Handoff Configurations

Tables required for inter-CBSC soft handoffs are discussed in the ICBSC application note at http://www.pamd.cig.mot.com/nds/cts/rftech/App_Notes/icsho/. Select the hyperlink entitled icsho_CAN_v0_1.fm. A brief list of the most important checks for implementing XASects and ICBSC-SHO and anchor handoffs is included here:

1. XCSects must have their HandoffMethod indicator set to Soft_Trunking. Also, the HOOverride attribute associated with the parent ICTRKGRP (identified in the XCSect) must be set to No_Override.
2. The Outward Route Index (ORI) for an XCSector should be non-zero, indicating there will be outward route traversal.


3. The AnchorHoMethod is set at a CBSC level to one of four choices: No_Legs, Legs_Remote, No_Legs_Wait, or Keep_Soft (Simple).
4. A value for OmcGroup must be provided for all OMC-Rs. The value may range from 1-8, and up to 8 CBSCs may be in each group.
5. The SELECTMODE parameter is set for either Round_Robin or Top_Bottom resource allocation.

More information on these checks, as well as additional CLI commands and database configurations, can be found in the section "Database Configuration" in the chapter entitled "Implementation" of the Inter-CBSC Soft Handoff Cellular Application Note.

Configuring the following EMX database tables facilitates inter-EMX handoffs (found along some CBSC borders) if the anchor handoff method is not set to Keep_Soft:

1. The BSS BSSRTE database entry on the EMX contains the Destination Point Code (DPC) for A+ messaging and the BSS Trunk Group for the Terrestrial Circuits to the target BSS.
2. The BSS CELRTE database entry on the EMX tells the target EMX where to find the target BTS. This entry will point to the BSS BSSRTE above, giving the EMX the required information to send A+ messages and set up Terrestrial Circuits to the correct target BSS.

These tables should be evaluated for accuracy. Reference documentation to enable this investigation is included in the section "EMX Commands" in the chapter titled "Implementation" of the Inter-CBSC Soft Handoff Cellular Application Note.

It is advisable to always check the FYIs concerning ICBSC-SHO for the latest information, at the following web page: http://www.pamd.cig.mot.com/nds/cts/rftech/App_Notes/icsho/icsho.html#fyis

Multi-carrier Configurations

Database configurations for multi-carrier operation, in particular pilot beacon or DAHO database setup, can be investigated by referring to the Multi-carrier Cellular Application Note. This can be found at: http://www.rochellepark.pamd.cig.mot.com/~fcleary/appnotes.html

4.5.2 XC Parameter Audit

Appendix 4B contains a detailed procedure on how to access and examine specific transcoder-related parameters. Using this procedure, the following XC parameters should be verified: XcHoT7, XcHoT1, XcHoT5, XcHoT3, XcHoT6, XcHoT2, XcHoT5, HHORetryCnt, PMRM_Threshold, handoff_mode, XC State 1 - XC State 11, XcSoT7, XcSoT8, Acquisition Count, Fast Repeat, Max Retry, RF Loss Count, Retry Timer Count.

4.6 Analysis Conducted

If any parameters are set differently in the database than the recommended default parameters or the desired system design parameters, or there are any problems identified in any of the database tables, these discrepancies should be documented. Following that reporting, analysis must be done to discover the reason for each discrepancy. Errors not meeting the intent of the network design activity or the desired default parameters should be corrected. Some parameters may be different from the recommended default because of optimization done in the simulation domain (e.g., SIF pilot powers); SIF powers should be checked against the system design. All discrepancies that have been documented should be worked off in the form of a punch list until all parameter settings are acceptable and agreed upon. Neighbor list and supporting table issues should be resolved in a similar manner. Transcoder database values should be verified as well.

4.7 Exit Criteria

- All RF parameter checking script output reports have been reviewed to identify discrepancies between the installed MIB and the recommended default parameters.
- Any errors have been identified and the parameters have been corrected.
- Any settings not meeting the intent of the system design have been corrected.
- Any missing neighbors have been added to the neighbor lists.
- Any undesired neighbors have been deleted from the neighbor lists.
- All handoff tables and settings, including XASect and XCSect, AnchorHoMethod, ORI, BSSRTE, and CELRTE, are set correctly to facilitate inter-CBSC soft and anchor handoffs and inter-EMX handoffs as required.
- The transcoder database has been investigated and deemed appropriate for this network design.

4.8 Recent Developments

A new database visualization tool, called Falcon, will be available in early March 1999. Information on Falcon can be obtained at: http://www.pamd.cig.mot.com/~toolprod/


APPENDIX 4A: show_allparms Usage

This information has been recreated from "RF Database Audit Procedure", February 5, 1999, Dennis Helm.

Note: For each OMCR release and vocoder rate, there is a version of show_allparms. Verify that the correct version of show_allparms is being used.

Execution of Script

1. All of the RF-audit tools will be located in the rf-audit directory. If this directory does not exist, then create it by typing the following command:
   omcr{scadm}$ mkdir rf-audit
2. Copy show_allparms to /home/scadm/rf-audit on the OMCR.
3. Change permissions to make the script executable:
   omcr{scadm}$ chmod +x show_allparms
4. Execute the script:
   omcr{scadm}$ show_allparms
5. Evaluate the output. The following output files are generated by show_allparms:

<date>.allparms.out
This file is comma-delimited ASCII text and can be easily imported into Microsoft Excel. See the example below.

diff_rec.out
The file diff_rec.out contains all of the parameters that are not set as per (Dillon's) default spreadsheet, as shown in the example below. The audit engineer should be ready to discuss these parameter changes with the customer. A good reference document is Matt Dillon's "Parameter and Optimization Guide", located at the following web site: http://www.cig.mot.com/~dillon

All differences should be noted in a final report along with parameter descriptions.

Example of diff_rec.out output:

   The Recommend values are listed as the first row of each Section
   The values are from Dillons spreadsheet dated 10/16/97
   These recommended values are based on a 13k Vocoder and MCC-8 system
   These parameters are listed in parenthesis
   BTS-SEC PARAMATER = Current Value (Recommended Value)
   2-2-1 TTDROP=2 (3)
   7-1-1 TDROP=-15 (-16)
   10-1-1 TTDROP=2 (3)
   12-2-1 TTDROP=4 (3)
   16-1-1 TTDROP=2 (3)
   16-1-1 SRCHWINA=7 (6)
   7-3-1 CELLRADIUS=5.7 (10)
   8-2-1 CELLRADIUS=8.6 (10)
   9-2-1 CELLRADIUS=11.5 (10)
   8-2-1 TCHPAMWINSZ=75 (25)
   58-1-1 TCHPAMWINSZ=66 (25)

Example of <date>.allparms.out output:

   The Recommend values are listed as the first row of each Section
   The values are from Dillons spreadsheet dated 10/13/97
   These recommended values are based on a 13k Vocoder R7 system
   BTS,SEC,CAR,PILOT PN,TADD,TCOMP,TDROP,TTDROP,SRCHWINA,SRCHWINN,SRCHWINR,NGHBORMAXAGE
   Rec,Val,,,-14,7.5,-16,3,6,8,9,0
   1,1,1,174,-14,7.5,-16,3,6,8,9,0
   1,2,1,180,-14,7.5,-16,3,6,8,9,0
   1,3,1,177,-14,7.5,-16,3,6,8,9,0
   2,1,1,255,-14,7.5,-16,3,6,8,9,0
   2,2,1,261,-14,7.5,-16,2,6,8,9,0
   2,3,1,258,-14,7.5,-16,2,6,8,9,0
   3,1,1,246,-14,7.5,-16,3,6,8,9,0
   3,2,1,252,-14,7.5,-16,3,6,8,9,0
   3,3,1,249,-14,7.5,-16,3,6,8,9,0
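Because <date>.allparms.out is plain comma-delimited text, it can also be post-processed outside of Excel. A minimal Python sketch that re-derives the per-sector differences from the "Rec,Val" reference row (file name and column layout as shown in the example above):

    import csv

    with open("<date>.allparms.out") as f:              # substitute the actual file name
        rows = [r for r in csv.reader(f) if len(r) > 1]  # drop the comment lines
    header = rows[0]   # BTS,SEC,CAR,PILOT PN,TADD,...
    rec = rows[1]      # the "Rec,Val" reference row
    for row in rows[2:]:
        for col in range(4, len(header)):    # skip BTS, SEC, CAR, and PILOT PN
            if row[col] != rec[col]:
                print("%s-%s-%s %s=%s (%s)" % (row[0], row[1], row[2],
                                               header[col], row[col], rec[col]))

This reproduces the same kind of report as diff_rec.out, and is convenient when comparing outputs collected on different days.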


APPENDIX 4B: Procedure to Evaluate Transcoder Parameters

(Taken from the Application Note entitled "Changing Transcoder Parameter Procedure" by Jim Woeste.)

Step 1) Open a window on the X-Term and, at the scadm prompt (in the /home/scadm/rf-audit directory), type the following command:

   tqcomm a0 9600

Wait a couple of seconds and hit the <RETURN> key until you see the following XC prompt:

   Cust BSS MMI-0115->

The numbers 0115 at the end of the prompt tell you that you are entering the XC through the OMP. The OMP is located in Cage 0, slot 20, in every XC. If you see another number (0XXX) at the end of the prompt, you should exit your tqcomm session by typing the following at the XC prompt:

   Cust BSS MMI-0XXX-> ;
   Cust BSS MMI-0XXX-> q

Move your RS-232 cable from whatever GPROC it is on to the RS-232 connector on the OMP and type in the tqcomm command again at the scadm prompt. Once you see the 0115 prompt, type in the following commands:

   Cust BSS MMI-0115-> ;    (; asks you for a command)
   Cust BSS MMI-0115-> l    (l is the command for logging)
   Starting logging.
   Log File : rf_xc_audit.out
   Log file created. Logging on.

You may lose your prompt for a second, but keep hitting <RETURN> until it comes back (usually a few seconds after entering the log file name). Once you see the prompt again, proceed to Step 2.

Step 2) Once you see the prompt again, type:

   Cust BSS MMI-0115-> chg_lev
   Enter password for security level you wish to access: 5cardstud

If 5cardstud doesn't work, try the following:

   Enter password for security level you wish to access: 3stooges
   Enter password for security level you wish to access: 4beatles

(The 3stooges password will prompt you for the 4beatles password.) 5cardstud, as well as the other two passwords, will not be displayed when you enter it. This will give you security level 3 access (which is unlimited access). You will need security level 3 access to display the set_state_timeout values on the XC.

Step 3) To display the call processing parameters, type the following command at the XC prompt:

   Cust BSS MMI-0115-> disp_cp_params

This will give you the following XC parameters:

   Call processing timers: XcCpT2, XcCpT5
   Handoff timers: T1, T3, T5, T7, T6

Step 4) The set_state_timeout command provides you with each of the State Timer values on the XC. DO NOT ENTER A VALUE WHEN YOU SEE THE PROMPTS WITH THE CURRENT SETTING IN THEM UNLESS YOU ARE MAKING A CHANGE. IF YOU JUST WANT TO DISPLAY THE CURRENT SETTING, THEN HIT ENTER WITHOUT ENTERING A VALUE. A PROMPT WITH A CURRENT VALUE IS SHOWN BELOW AS AN EXAMPLE.

   Enter the Timeout Value (0-86399999 in ms)(cur = 2500 ms):

The set_state_timeout command will only allow you to view one State Timer at a time; therefore, you will need to enter the command 11 times. Each time you re-enter the command, you will be prompted for the State Timer you wish to display/change. See the example below for the command sequence as well as the current settings in Lombard. Notice that the example below only displays the values and does NOT change them.

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 1
   Enter the Timeout Value (0-86399999 in ms)(cur = 2500 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 2
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 3
   Enter the Timeout Value (0-86399999 in ms)(cur = 7000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 4
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 5
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 6
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 7
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 8
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 9
   Enter the Timeout Value (0-86399999 in ms)(cur = 5000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 10
   Enter the Timeout Value (0-86399999 in ms)(cur = 2000 ms):

   Cust BSS MMI-0115-> set_state_timeout
   Enter the CDMA CP State Number (0-20): 11
   Enter the Timeout Value (0-86399999 in ms)(cur = 10000 ms):

Step 5) To verify that all layer 2 parameters are set correctly, type the following command at the XC prompt:

   Cust BSS MMI-0115-> disp_l2_para
   Enter the XCDR identifier (0 - 85): 0    <--- the XCDR identifier is the XCDR card number

This implies that for each XCDR card in your system you will need to repeat the disp_l2_para command.

Step 6) You can end your tqcomm session by typing the following commands at the XC prompt:

   Cust BSS MMI-0115-> ;
   Cust BSS MMI-0115-> q

Your prompt should now be the same type of scadm prompt on the X-Term window that you had before the tqcomm session began. Your log file from the tqcomm session should be in the UNIX directory you are located in (/home/scadm/rf-audit). Type ls at the scadm prompt to see the log file you created.

NOTE: IF YOU HAVE PROBLEMS GETTING THE TQCOMM SESSION TO WORK, PERFORM THE FOLLOWING STEPS. MAKE SURE THAT THE REASON YOU ARE UNABLE TO START A TQCOMM SESSION IS NOT DUE TO HAVING MORE THAN ONE TQCOMM SESSION OPEN AT THE SAME TIME.

STEP 1. Type the commands shown in bold below at the scadm prompt:

   lombardomc2{scadm}$ cd /bin
   lombardomc2{scadm}$ cfreset asynch# port#

where # in asynch# is the asynch card you are using on your OMCR, and # in port# is the port number on the asynch card you are using. Usually asynch# is asynch0 and port# is port0. Example:

   omc{scadm}$ cd /bin
   omc{scadm}$ cfreset asynch0 port0

STEP 2. Now try using the tqcomm command to access the transcoder. If you are still unable to get a transcoder prompt, then try the following process:

A) Remove the RS-232 cable/connector from the OMP.
B) Using another RS-232 cable, hook up a dumb terminal to the OMP before turning the dumb terminal on.
C) Turn the dumb terminal on. XC alarms should be scrolling across the screen. You have successfully unlocked the RS-232 port for communication with the OMCR.
D) Unplug the new RS-232 cable and hook up the old RS-232 cable/connector. The tqcomm session should now work fine. If not, call MSCS and open a call log.


5.0 Spectrum Clearing, Noise Floor Test Verification, and Noise Monitoring

5.1 Description

The purpose of spectrum clearing is to eliminate any non-CDMA RF interference on the forward and reverse CDMA links throughout the area of network operation. The presence of any additional noise in the network could adversely impact the coverage and capacity of a CDMA cell. It is possible that the spectrum targeted for CDMA operation has not been cleared of AMPS channel operation (in the 800 MHz band). This may be more of an issue in areas where CDMA operation is targeted, but immediately adjacent areas of operation are still using some of the CDMA channels for AMPS operation. Similarly, in the case of PCS or non-domestic systems, existing microwave or other spectrum users need to be cleared from the frequency band in which CDMA is targeted to operate.

Since it is the customer's responsibility to ensure that the spectrum is clear, the focus of this chapter is to provide a description of key indicators that signal increasing levels of interference, which would trigger follow-up spectrum clearing or noise floor testing activities at particular sectors or sites. These triggers should be monitored during SCFT (Chapter 7) and beyond, through the entire optimization activity and into commercial service. Once a problem is identified, via examination of either system data (alarms) or drive test data, the procedure referenced in this section can be used to further characterize the interference at a particular cell site or sector.

Figure 5.1-1 (next page) shows the relationship of the initial spectrum clearing verification to other optimization activities. After that, the monitoring techniques presented in this chapter can be used to continually identify noise in the network.


[Figure 5.1-1 is a flow diagram. Optimization Preparation (Network Design Verification, Chapter 2; Equipment Installation and Test Verification, Chapter 3; RF Parameters Database Verification, Chapter 4; Spectrum Clearing, Chapter 5; Data Collection and Analysis Tools Selection, Install, Test, Chapter 6), with accurate terrain, clutter, and XLOS tuning data feeding the system design via NetPlan/CSSS, leads into Network Optimization (Single Cell Functional Test, Chapter 7; Initial Coverage Test, Chapter 8; System Optimization and Detailed Problem Resolution, Chapter 9; Final Coverage Survey and Warranty Testing, Chapter 10), followed by System Operations and Commercial Service: Network Performance Monitoring and Expansion (Chapter 11).]

Figure 5.1-1: Relationship of Spectrum Clearing Activity to Entire Optimization Process


5.2 Tools Required

The tools listed in Table 5.2-1 can be used to detect and isolate noise- or interference-induced problems.

Compas: Used to evaluate DM and SMAP data. See Chapter 6 (Tools Selection) for more information.

Event Logs: Provide a history of possible service-affecting conditions. See the System Performance Monitoring Guide (separate document).

banditview: Recently developed tool; can be found at http://www.cig.nml.mot.com/cdma/kctopt/tools/. Generates a view of BBX Balance and BBX Reverse Noise Rise Alarms for a particular site over a given hour of operation. Input is the event logs.

SMAP: http://scwww.cig.mot.com/~thakkar/smap.html

Table 5.2-1: Interference Isolation Tools

In the event that a problem is identified at a certain site or sector, the actual procedure for performing noise floor testing is entitled "CDMA Uplink Noise Survey Procedure". This procedure can be located at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html under the RF Planning option. This document contains specific references to equipment used at a cell site to characterize the noise. Substitution of specific equipment may be necessary to compensate for differences in frequency and equipment variations from market to market.

5.3 Personnel Required

Type                                                             Skill Level
RF System Engineer, to oversee noise monitoring and              White Belt (See Appendix A)
elimination activities
Cellular Field Engineer, to execute Noise Floor Test Procedure   See Appendix A
CBSC Engineer, to monitor noise indicators                       See Appendix A

Table 5.3-1: Personnel Required

The number of locations at which noise floor testing is to be performed will dictate the number of teams and the time required to complete this test. For further information, please reference the CDMA Uplink Noise Survey Procedure.

5.4 Entrance Criteria

The activity of noise monitoring should commence once the customer has indicated that the spectrum has been cleared to the best of his ability and the system is ready for operation. In general, the following inputs could be used to trigger the execution of the noise survey procedure at a cell site/sector:

1. Interference may be identified by excessive BBX Balance or BBX Reverse Noise Rise Alarms in the event logs generated at the OMC-R.

2. Drive data from SCFT or optimization activities may indicate that the forward or reverse link is suffering from high FER. This may include temporal variations of noise levels, which are more difficult to isolate.

3. A status report from the customer that indicates the spectrum clearing results. This will identify any area in which the customer was unsuccessful at identifying and eliminating potential interferers.

The service provider should be able to migrate his users from the band being cleared to other AMPS channels. However, there is no guarantee that the system being installed will not be subject to intersystem interference, especially along boundaries where the service provider has not cleared the CDMA band of analog users in immediately adjacent geographic areas. (This type of scenario may be encountered when a service provider wants to offer CDMA service in a core area, then expand the CDMA system boundaries on a piecemeal basis.) Nor is there any guarantee, for new spectrum allocations, that there will be no illegal spectrum users once the band is cleared of traditional analog users. The customer may require assistance in characterizing the noise sources.

5.5 Procedure

Identifying areas affected by interference can be accomplished either by monitoring the BBX Reverse Noise alarms or BBX Balance alarms that appear on the system, or via active drive testing. Noise present on the forward link does not confirm noise on the reverse link, and vice versa; each condition must be checked separately.

5.5.1 Forward Link Noise

Drive testing with either a mobile or a spectrum analyzer is the only method that will identify noise present on the forward link. A typical symptom of noise on the forward link is consistent or intermittent call failures at a specific location, coupled with an increase in Mobile Receive (MobRx) levels, elevated FER, and degraded Ec/Io on the forward link. If this happens, the engineer should consult with the customer to determine if the area is a known problem (interference) area. If the noise source is understood, it should be documented so no more time is wasted on this area. If the noise source is not understood, the engineer should investigate with a directional Yagi antenna feeding into a spectrum analyzer tuned to the CDMA band of interest to isolate and identify the source of interference. If the type of interference can be characterized, this information should be provided to the customer organization so the interference can be eliminated. (In the case of forward link interference, it is not necessary to execute the noise floor testing procedure at a cell site.)


5.5.2 Reverse Link Noise

The most efficient method to identify noise issues on the reverse link is to perform routine checks of the alarms. The use of alarm data to trigger and justify the effort of executing noise floor testing is discussed in Section 5.6.1. A new tool to identify the presence of excessive noise on the reverse link is called banditview. The only drawback with using alarms is that noise levels must exceed specific thresholds before the alarms are triggered and recorded in the event log; there may be other noise present, but not enough to register the alarms. Alternatively, SMAP provides both main and diversity RSSI data for all three/six sectors. This helps to ensure that the spectrum is clear of any unwanted noise on the CDMA band before putting any commercial traffic on the system. SMAP also provides reverse link frame erasure rate (FER) data that can be collected during drive test activities. The mobile must be in Markov mode in order to collect valid SMAP Reverse FER data. Follow the guidelines in the SMAP documentation to set up the proper SMAP profiles to collect RFER data for the Markov mobiles used during the drive. (In areas near ICBSC borders, note that Markov calls will not perform ICBSC handoffs.) The usage of SMAP data to trigger the noise floor investigation is described in Section 5.6.2. SMAP also helps to determine whether the main and diversity antennas are connected properly. For example, cables are often inadvertently swapped during installation, and sometimes the two diversity antennas are swapped. Although system engineers can detect that main Rx antennas are swapped by looking at a PN value, they cannot confirm whether diversity antennas are swapped without using SMAP, because no other tool shows this data. For information on running SMAP, go to:
http://scwww.cig.mot.com/~thakkar/smap.html

5.6 Analysis Conducted

5.6.1 Use of (Reverse Link) Alarm Data to Trigger Noise Survey Procedure

An increase in the number of BBX Reverse Noise alarms or BBX Balance alarms is a primary indication of interference on the system. This information can be obtained from the system event logs, which are located at the CBSC on the OMC-R in the /sc/events directory. Alarms are time-stamped, so it is possible to check for interference during specific periods of time. This information should be monitored over a period of time to characterize alarm variations, which should direct the selection of times to visit cell sites for noise floor testing. A few days or a week should be a sufficient amount of time to establish noise trends.


A script called banditview can provide an indication of the number and severity of BBX Balance and BBX Reverse Noise Rise Alarms for any given hour of operation. This tool can be used in conjunction with drive testing: if interference is suspected during a specific period of time in a specific location, this script will report alarm conditions on a BTS/sector during a specific hour. This script is described in Section 5.8 (Recent Developments) and can be retrieved from http://www.cig.nml.mot.com/cdma/kctopt/tools/. For more information on long-term monitoring of the network's event logs, please refer to the System Performance Monitoring Document, section TBD, Event Logs.

5.6.2 Use of (Reverse Link) SMAP Data to Trigger Noise Survey

The engineer can use the SMAP (System Monitoring Application Processor) data generated by any Markov mobile specified in the SMAP profile to identify poor RFER. The engineer should look at the RFER data (either plotted using a tool such as COMPAS or by looking directly at the SMAP messaging) to identify areas that are worse than 3 to 5%. The engineer should also look for areas that have poor Mobile Transmit (MobTx) values, or instances where the MobTx approaches maximum power and then falls below -30 dBm (the mobile shuts off), as either case may indicate interference. If areas of interference are identified, the engineer should request that a CFE perform noise floor testing at the specific BTS/sector(s) surrounding the problem area, using the CDMA Uplink Noise Survey Procedure document. These procedures are specific to conversion of an 800 MHz system from analog to CDMA operation; for different systems, appropriate modifications should be made to the equipment setups and the frequencies investigated. Results of the noise floor survey should be discussed with the customer to identify an appropriate resolution plan.
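To make these screening rules concrete, the following is a minimal sketch that flags drive-test samples against the two indicators above. The sample layout, the 23 dBm maximum, and the -30 dBm shut-off reading are illustrative assumptions for this sketch; the RFER threshold mirrors the 3 to 5% guidance in this section.

# Minimal sketch: flag drive-test samples per the Section 5.6.2 indicators.
# Assumed sample layout: (time, rfer_pct, mob_tx_dbm). The 23 dBm maximum
# and the -30 dBm "mobile shut off" reading are illustrative assumptions.

MAX_TX_DBM = 23.0
SHUTOFF_TX_DBM = -30.0

def flag_samples(samples, rfer_limit_pct=5.0):
    flags = []
    previous_tx = None
    for time, rfer_pct, mob_tx_dbm in samples:
        if rfer_pct > rfer_limit_pct:
            flags.append((time, f"RFER {rfer_pct:.1f}% exceeds {rfer_limit_pct}%"))
        # MobTx ramped to near maximum on the previous sample, then the
        # mobile dropped off: a possible reverse link interference sign.
        if previous_tx is not None and previous_tx >= MAX_TX_DBM - 1.0 \
                and mob_tx_dbm <= SHUTOFF_TX_DBM:
            flags.append((time, "MobTx hit maximum, then mobile shut off"))
        previous_tx = mob_tx_dbm
    return flags

samples = [("10:01:05", 1.2, 5.0), ("10:01:07", 6.8, 22.5),
           ("10:01:09", 0.0, -31.0)]
for time, reason in flag_samples(samples):
    print(time, reason)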

5.7 Exit Criteria

Noise monitoring is an ongoing activity. At this stage (prior to the start) of the optimization cycle, it is important to verify that:
- The spectrum is cleared of any previous analog users or other system users.
- Adjacent geographic areas are cleared of CDMA band analog users.
- The results of any noise floor testing procedures executed by CFEs are shared with the optimization engineering team, and any interference problems found during this testing have been resolved.

5.8 Recent Developments

5.8.1 The banditview Script

5.8.1.1 Description

Given a specific BTS number, the event logs containing information for that BTS, and a specific time of interest (in hours) as inputs, the banditview script will provide a graphical view of "BBX Reverse Noise Rise Alarms" and "BBX Balance Alarms" for that specific time period. The BBX reverse noise rise alarms and BBX balance alarms can help spot potential interference from any external or equipment in-band sources of noise in a specified area. This interference is expressed in terms of a bandit index, calculated for an area as follows: using banditview, count the number of minutes in which BBX alarms were reported for a specific hour and CBSC; total the minutes for all sectors under the given CBSC; then divide the total by the number of sectors. The result will be a number between 0 and 60 (a per-sector average of alarm-minutes per hour). Since this is a relative measurement that depends upon the number of users and external influences, a baseline should be developed and then tracked against. (A sketch of this calculation appears at the end of this section.)

5.8.1.2 Usage

This is a UNIX script. The command line is:

<directory containing script>/banditview <bts> <eventlog>

where:
bts = the specific BTS number
eventlog = the event log for the specific hour (e.g. evt.19990418010005; the name encodes year/month/day/time). If necessary, provide the path to the event log location.

5.8.1.3 Output

The X axis represents time in minutes across one hour (00, 05, 10, ..., 55, 59), and the Y axis represents the sector and carrier. An example of the output follows:

(*------- BBX Noise and Balance Alarm Report for Site 000 -------*)
No Noise = .   Noise Rise Alarm = o   BBX Balance Alarm = O

Time  00   05   10   15   20   25   30   35   40   45   50   55  59
BBX   +----+----+----+----+----+----+----+----+----+----+----+----+
 1    .oo..oo.O.ooooooooooooo.oooo.OOOooo
 2    ..ooooooooooOOOOooo.OOOOOOOOOOOOOOO.oooOO
 3    .OOOOOOOOOOooo.
 4    ..oooooooooo.ooo..ooo..ooooooo..o
 5    ..ooooooooooooooooooooooooooooooooooooooooo.
 6    .oooooooooooooooooooooo.oooooo..ooo.ooo
21    oooooooooooooooooooooooooooooooooooooooooooooooooo
22    oooooooooooooooooooooooooooooooooooooooooooooooo..
23
24    ooooooo.oooooooo.oooooooooooo.
25    .o.o.o.o.oooooo.oooo.o.oo.ooo.o.oooooooooooooooooo
26    oooooooo..ooooooooooo.ooooooooooo.ooooooooooooooo..

Figure 5.8-1: Output of banditview script

5.8.1.4 Credits

This script was written by Jonathan Hutcheson of MJL.
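As a concrete illustration of the bandit index arithmetic described in Section 5.8.1.1, here is a minimal sketch (not part of the banditview deliverable); the row format follows the example output in Figure 5.8-1, with one mark per minute and '.' meaning no alarm.

# Minimal sketch of the bandit index calculation described in Section 5.8.1.1.
# Each row is one sector's marks for the hour: '.' = no alarm, 'o' = Noise
# Rise Alarm, 'O' = BBX Balance Alarm (format per Figure 5.8-1).

def bandit_index(sector_rows):
    """Average alarm-minutes per sector; result falls between 0 and 60."""
    if not sector_rows:
        return 0.0
    total_alarm_minutes = sum(
        sum(1 for mark in row if mark in ("o", "O")) for row in sector_rows
    )
    return total_alarm_minutes / len(sector_rows)

# Hypothetical hour with three sectors (60 one-minute marks per row):
rows = [
    "o" * 10 + "." * 50,  # 10 alarm-minutes
    "O" * 20 + "." * 40,  # 20 alarm-minutes
    "." * 60,             # quiet sector: 0 alarm-minutes
]
print(bandit_index(rows))  # (10 + 20 + 0) / 3 sectors = 10.0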


6.0 Tools Selection, Installation, and Test


6.1 Description

The purpose of this section is to offer guidelines for selecting the CDMA data collection and analysis tools used in RF optimization of the CDMA system. Motorola and other vendors are constantly upgrading their product offerings; many of these offerings are captured in this chapter for reference. Each account team should use the guidelines and references within to select the tools that meet their specific deployment requirements. The relationship of this activity to other network optimization activities is shown in Figure 6.1-1 below. Figure 6.1-2 shows an overview of the general classes of data collection and analysis tools that are used during network optimization activities. Some of the tools or scripts used to collect and analyze data are Motorola products or were created by Motorola employees; typically, these tools operate on proprietary or customer-sensitive data outputs of the system. Some classes of tools, such as mobile diagnostic monitors, are more generic and may be procured from any number of vendors. Appendix 6A contains more detailed information on both Motorola and non-Motorola tools. (All tools should come with installation and operation guides. The details of installation, test, and operation for each tool are beyond the scope of this discussion; please refer to specific tool documentation for necessary information.)


Optimization Preparation:
- Network Design Verification (Chapter 2), using accurate terrain, clutter, and Xlos tuning data and the system design via NetPlan/CSSS
- Equipment Installation and Test Verification (Chapter 3)
- RF Parameters Database Verification (Chapter 4)
- Spectrum Clearing (Chapter 5)
- Data Collection and Analysis Tools Selection, Install, Test (Chapter 6)

Network Optimization:
- Single Cell Functional Test (Chapter 7)
- Initial Coverage Test (Chapter 8)
- System Optimization and Detailed Problem Resolution (Chapter 9)
- Final Coverage Survey and Warranty Testing (Chapter 10)

System Operations:
- Commercial Service: Network Performance Monitoring and Expansion (Chapter 11)

Figure 6.1-1: Relationship of Tools Selection, Installation and Test Activity to Entire Optimization Process


[Figure 6.1-2 diagrams the general classes of data collection and analysis tools and the data each operates on: database tools, PM data, call detail logs, SMAP, mobile diagnostic monitors, and pilot scanners, fed by the network elements (EMX, CBSC, OMC-R) and GPS-equipped test mobiles, with post-processing and RF performance tools such as COMPAS and OPAS32 on the analysis side.]

Figure 6.1-2: Tools Overview


6.2 Tools Required

To evaluate and select CDMA optimization tools requires a minimal set of office tools, as listed in Table 6.2-1.

Tool Name: Computer with printer
Description and Vendor: Any
Recommended Quantity: Market dependent

Tool Name: Access to the Internet, Motorola Intranet, and e-mail
Description and Vendor: Any
Recommended Quantity: Market dependent

Tool Name: Word processing application and spreadsheet software
Description and Vendor: Examples: Word, WordPerfect, Excel, or FrameMaker
Recommended Quantity: Market dependent

Tool Name: A folder or binder
Description and Vendor: All relevant notes will be placed in it, along with the vendor documentation, for the final assessment.
Recommended Quantity: Market dependent

Table 6.2-1: Tools Required To Conduct CDMA Optimization Tools Survey

6.3 Personnel Required


Type: RF System Engineer
Skill Level: Blue Belt (See Appendix A)

Type: Market Manager
Skill Level: The ability to appropriate the necessary resources so that the tools can be acquired in a timely manner to meet the engineering needs of the market.

Table 6.3-1: Personnel Required

This selection process and evaluation is highly dependent upon response times from the various vendors; it should take roughly one to two weeks of the RF system engineer's time. The market manager will then be required to review the recommendations and procure the necessary equipment. Following that, the tools will need to be installed and tested by the engineering team.


6.4 Entrance Criteria

There are no specific entrance criteria for starting research of the tools.

6.5 Procedure

The procedure for selection of tools is fairly straightforward at a high level and consists of the following four steps:

1. Evaluate market requirements
2. Tools research and evaluation
3. Tools procurement
4. Tools installation and test

A detailed technical evaluation is required to ensure that the tools will meet the specific job requirements. Many tools will be selected because there are no alternatives; where alternatives exist, the best selection should be made. The activities listed above are discussed below.

6.5.1 Evaluate Market Requirements

The following issues should be considered by the RF network optimization engineering team and market manager to identify specific tool requirements for the optimization activity:

1. What are the requirements of the data collection and analysis tools for verifying that Motorola has fulfilled its contractual obligations of RF performance (before handing over the CDMA system to the customer)? What types of reports must be supplied to show satisfactory compliance with contract warranties? What tools will be required after the system reaches commercial status?
2. What is the schedule for completing the RF network optimization? How large is the engineering team that will be sharing these tools? How many of each tool type are needed? How many teams will be working in parallel (each requiring a set of tools)?
3. Are there any market-specific deployment issues requiring support from prospective tool vendors to log/convert/process data relevant to any new features in the deployment? Are the tools current or outdated?
4. How extensive and thorough will the data collection and documentation need to be? [Certain tools offering cost savings may sacrifice quality in some areas.]


6.5.2 Tools Research and Evaluation

From the requirements derived from the exercise above and the list of candidate tools found in Appendix 6A, generate a customized candidate list of tools that will be considered for procurement. The following tasks should be performed during this research activity:

1. Contact vendors to request updated specifications. Pay particular attention to data compatibility. Use the phone, fax, or Internet to gather data.
2. Request demonstrations when possible, especially for newly advertised tools and features.
3. Gather tool pricing data. Quotes may have to be requested from vendors' sales departments; these people may be separate from the technical contacts.
4. Summarize the information gathered on a spreadsheet listing the pros and cons of each tool. For example, Table 6.5.2-1 is a snapshot of a spreadsheet, containing high-level pros and cons, that was put together for the Japan market.
5. Determine whether tools can be borrowed from other Motorola account teams that are not using them at this time, and if so, which tools.


Candidate Tool: Grayson Surveyor
Pros: Multiple phones can be logged at once. Hardware can contain a pilot scanner as well as mobile ports.
Cons: Additional hardware needed; cannot plug a mobile directly into the PC. Not yet compatible with the JCDMA band. Poor data logging procedure; only logs when an event "triggers" logging.
Cost/Availability: $XXXXX per unit. Available on MM/DD/YY.

Candidate Tool: Safco WinDM / Walkabout
Pros: Windows configuration makes real-time troubleshooting very easy.
Cons: Walkabout is designed for in-building coverage testing. Log files are only compatible with the OPAS32 post-processing tool. Difficult for a non-English speaker to use; very "menu intensive".
Cost/Availability: $XXXXX per unit. Available on MM/DD/YY.

Candidate Tool: CAIT
Pros: Easy to set up, easy operation. Log files are compatible with all post-processing applications, including Qualcomm Compas. Only software to install; the customer can use existing laptops* and GPS setups.
Cons: * Will need to add the Win95/98 OS to NT computers.

Table 6.5.2-1: Sample Tools Evaluation Spreadsheet

6.5.3 Tools Procurement

Once the tools have been identified and selected, the recommendations, including vendor part numbers, descriptions, quantities, pricing, and vendor contact information, should be entered on the purchase requisition form. This form must be submitted to the appropriate management for signature (program manager, finance), then forwarded to purchasing to place orders. Due dates are critical.

6.5.4 Tools Installation and Test

Once the tools have been received, they should be installed and tested on the actual system as it nears readiness for Single Cell Functional Test (SCFT). Engineers should refer to the documentation that comes with the individual tools. References are given in Appendix 6A for the Motorola tools.

6.6 Analysis Conducted

Tradeoff analysis should include the following items:


1. Cost. This should take into consideration not only the expense of the tool, but also the processing time required by the tool, the configuration time, and the learning curve for using the tool.
2. Tool features. Is it compatible with other tools? Does it do everything you want it to do? Does the tool have so many options that it becomes user-unfriendly and requires the user to perform multiple steps to complete a single task?
3. Vendor reliability. This should include the cost of the vendor support packages.
4. Tool availability. What will be the lead times? Does the vendor have a working, viable solution today, or is the vendor promising a feature to be delivered in the near future?

6.7 Exit Criteria
1. Tools have been evaluated and selections complete.
2. Tools have been purchased and delivered.
3. Tools have been installed and tested.


Appendix 6A: Tools References


Information in Table 6A-1 below is a sample listing of tools that will be required for the RF optimization of a CDMA system. The first set of tools in the table are Motorola tools; the second set are third-party tools that can be procured from outside of Motorola. [Note: It is important to realize that this list is only a snapshot at publication time; one should do additional research into recent product offerings prior to final selection and purchase.]

Table 6A-1: Motorola Developed Tools & Products

Tool Name: CAMPS
Operates On: Mobile Phone Data and GPS Position Data
Reference: http://scwww.cig.mot.com/SC/mgmt/tools/CDMA/Test_Tools/CAMPS/index.html

Tool Name: SMAP
Operates On: Data collected from LAN
Reference: http://scwww.cig.mot.com/tools/smap/ and http://scwww.cig.mot.com/~thakkar/smap.html

Tool Name: COMPAS
Operates On: DM Data and SMAP Data
Reference: http://www.sesd.cig.mot.com/compas/

Tool Name: Pilot Analyzer
Operates On: HP Pilot Scanner Data
Reference: http://engdb.tlv.cig.mot.com/tools/PilotAnalyzer.html

Tool Name: CDL Analysis Tool (CAT)
Operates On: Call Detail Logs
Reference: http://www.cig.mot.com/~wheelrts/analyzer.html

Tool Name: PM SUM, PMTRAF, PMMCC, CEM
Operates On: PM Data
Reference: http://www.cig.mot.com/ted/EXT_WEB/TED/pdf/english/R7pdf_nof/226A24GO/226A24GO.PDF and http://www.rochellepark.pamd.cig.mot.com/software.html

Tool Name: SCUNO
Operates On: System Event Logs and Alarm Data
Reference: http://www.cig.mot.com/ted/EXT_WEB/TED/pdf/english/R7pdf_nof/226A24GO/226A24GO.PDF

Tool Name: Show or Compare All Parameters Script
Operates On: MIB
Reference: http://www.cig.mot.com/ted/EXT_WEB/TED/pdf/english/R7pdf_nof/226A24GO/226A24GO.PDF and http://www.rochellepark.pamd.cig.mot.com/~dhelm/omcr.htm

Tool Name: Neighbor List Tools: sho_time (generates neighbor list (NL) recommendations from DM data or HP scanner data)
Operates On: Processed COMPAS Data
Reference: http://www.cig.mot.com/~spresney/sho_time/sho_time.html

Tool Name: Neighbor List Tools: nlp_pl5 (generates NL recommendations from CDLs)
Operates On: Processed CDLs
Reference: http://www.cig.mot.com/~reimrsrr/NLP.html

In addition to the Motorola tools previously listed, the following third-party tools may be considered as candidates for various optimization activities:

Field Data Collection Tools

1) Mobile Diagnostic Monitors
QUALCOMM CAIT (Windows) Software Package 6455 Lusk Blvd, San Diego, CA 92121 Phone (800) 349-4474 Fax (619) 658-2567

http://www.qualcomm.com/cdma/infrastructure/ancillary/ For a listing of CDMA licensed suppliers from QUALCOMM go to http://www.qualcomm.com/cdma/tech/license.shtml.


Grayson Wireless Illuminator 140 Vista Centre Drive, Forest Virginia 24551 USA Phone 800-8007465 or 804-386-5300 Fax 804-386-5324 www.grayson.com or

http://www.grayson.com/contactus.html
SAFCO Technologies, Inc. (WALKABOUT, SMRTSAM2/PLUS, and PROMAS32) 6060 Northwest Highway, Chicago, IL 60631-2518 Tel: 1-800-843-7558 (toll-free in USA; press 1 for Support, press 3 for Sales) Tel: 1-773-631-6216; Fax: 1-773-631-1626; Sales: 1-773-467-2707 Product Support: 1-800-544-1431 or 1-773-467-2706 email: sales@safco.com Support Email: support@safco.com

http://www.safco.com/products.html
ROHDE&SCHWARZ Inc. 4425 Nicole Drive; Lanham, MD 20706 Tel. (301) 459-8800; Fax (301) 459-2810

http://www.rsd.de/produkt/tm_mob.htm or http://www.rsd.de/produkt/215a.htm


2) CDMA Pilot Scanners and/or Spectrum Analyzers

Hewlett Packard Pilot Scanner
HP E7472A CDMA Integrated RF and Call Performance Coverage Test System
http://www.tmo.hp.com/tmo/datasheets/English/HPE7472A.html
http://www.hp.com/go/drive_test
Grayson Wireless PN Scanner Grayson Wireless Analyzer 140 Vista Centre Drive, Forest Virginia 24551 USA Phone 800-8007465 or 804-386-5300 Fax 804-386-5324 www.grayson.com or

http://www.grayson.com/contactus.html

Berkeley Varitronics Systems
255 Liberty Street, Metuchen, NJ 08840
(732) 548-3737 Fax (732) 548-3404
http://www.bvsystems.com/Products/CDMA/cdma.html

LCC International, Inc.
7925 Jones Branch Drive, McLean, VA 22102, USA
(703) 873-2000
http://www.lcc.com/
http://www.lcc.com/whaznew/Newsletter/OptimEyes_Trblsht/body_optimeyes_trblsht.html
ANRITSU COMPANY (Radio Communication Analyzer)
1155 East Collins Boulevard, Richardson, TX 75081
1-800-ANRITSU (800-267-4878); Tel: 972-644-1777; Fax: 972-644-3416
Email: moreinfo@naro.us.anritsu.com
http://www.global.anritsu.com/products/test/rfmicrowireless/MT8802A.html


3) Post-Processing Tools
OPAS32 (SAFCO Technologies, Inc.) 6060 Northwest Highway, Chicago, IL 60631-2518 Tel: 1-800-843-7558 (toll-free in USA; press 1 for Support, press 3 for Sales) Tel: 1-773-631-6216; Fax: 1-773-631-1626; Sales: 1-773-467-2707 Product Support: 1-800-544-1431 or 1-773-467-2706 email: sales@safco.com Support Email: support@safco.com

http://www.safco.com/products.html
Grayson Wireless Analyzer 140 Vista Centre Drive, Forest Virginia 24551 USA Phone 800-8007465 or 804-386-5300 Fax 804-386-5324 www.grayson.com or

http://www.grayson.com/contactus.html


7.0 Single Cell Functional Test (SCFT)


7.1 Description

The purpose of the Single Cell Functional Test (SCFT) is to ensure basic functionality and operation of each cell, and to identify and resolve any remaining hardware or software issues. Single cell functional testing is required at each site in a cluster, and at the first-tier sites surrounding that cluster, prior to performing the initial coverage survey for a given cluster. The SCFT schedule should be arranged such that an entire cluster is ready for optimization before moving on to the next cluster. The relationship of the SCFT to the rest of the optimization cycle is shown in Figure 7.1-1.

Optimization Preparation:
- Network Design Verification (Chapter 2), using accurate terrain, clutter, and Xlos tuning data and the system design via NetPlan/CSSS
- Equipment Installation and Test Verification (Chapter 3)
- RF Parameters Database Verification (Chapter 4)
- Spectrum Clearing (Chapter 5)
- Data Collection and Analysis Tools Selection, Install, Test (Chapter 6)

Network Optimization:
- Single Cell Functional Test (Chapter 7)
- Initial Coverage Test (Chapter 8)
- System Optimization and Detailed Problem Resolution (Chapter 9)
- Final Coverage Survey and Warranty Testing (Chapter 10)

System Operations:
- Commercial Service: Network Performance Monitoring and Expansion (Chapter 11)

Figure 7.1-1: Relationship of Single Cell Functional Test Activity to Entire Optimization Process

The Single Cell Functional Test will ensure:
- The cell site hardware is functional
- The connection between the site and the CBSC is functional
- The ability to place Land-to-Mobile and Mobile-to-Land calls on each sector
- The antennas are pointing in the correct directions and are broadcasting the correct PN
- The PN footprint plots show adequate mobile receive and Ec/Io levels
- Softer and soft handoffs occur (softer: handoff between sectors of the same cell site; soft: handoff between two different cell sites)


7.2 Tools Required
Item: Diagnostic Monitor (DM)
Description or Vendor: Laptop computer with CDMA testing software. Candidates: Motorola CAMPS, Qualcomm MDM or CAIT, Safco Walkabout, or equivalent.
Recommended Quantity: 2 per vehicle

Item: Global Positioning System (GPS)
Description or Vendor: Trimble Placer GPS or equivalent. Required to supply time/location data for each DM.
Recommended Quantity: 1 per DM

Item: CDMA Phone
Description or Vendor: Qualcomm, Sony, Motorola, Toshiba, Panasonic, or equivalent compatible with market frequency bands.
Recommended Quantity: Dependent on size of market

Item: Drive Test Data Post-Processing Tool(s) and RF performance verification tool
Description or Vendor: COMPAS, OPAS, or equivalent (see Chapter 6, Tools Selection).
Recommended Quantity: Dependent on market configuration

Item: CDLs and CDL processing tool
Description or Vendor: Cdl_browse, CDL Analyzer

Table 7.2-1: Tools Required

7.3 Personnel Required
Position: RF Optimization Engineer
Skill Level: White Belt (See Appendix A)

Position: Data Collector / Field Engineer
Skill Level: Good computer background, capable of operating DM and mobile. Good penmanship. See Appendix A.4.

Position: Driver
Skill Level: Valid driver's license. See Appendix A.6.

Position: CBSC Engineer
Skill Level: Knows CBSC operations. See Appendix A.8.

Position: Landline Operator
Skill Level: Can talk on the phone and fill out test forms.

Table 7.3-1: Personnel Required

7.4 Entrance Criteria

1. Network Design and Optimization Preparation steps must be successfully completed.
2. The spectrum should be clear of all unauthorized users.
3. The BTS must be completely installed, ATP'd, and OPTO'd.
4. The database must be complete and checked against design and default values.
5. All necessary tools must be installed and operational. This may include setting up PN mapping files for use by the DM display.

7.4.1 Notes on Entrance Criteria

In an ideal situation, all sites in a cluster should be installed, ATP'd, and OPTO'd prior to starting single cell functional testing. Having all neighboring sites ready when doing the SCFT will allow verification of soft handoff functionality. Having all sites up will also give more realistic detail in viewing the RF propagation of each sector. Due to cell site, BTS link, and database install scheduling, having all sites ready at one time may not be possible. If a new site needs to be function tested before its neighbors are ready, the engineer must set up further plans for soft handoff testing after the neighboring sites are up. The SCFT procedure was written from the perspective that at least one tier of neighboring sites is operational while performing the function test. Again, the point of the SCFT is to test the functionality of the new BTS, one of those functions being handoff capability with its neighbors.

7.5 Procedure

This procedure covers the data collection and subsequent analysis required for SCFT. Two data collection/analysis methods are proposed. Method 1 assumes that data is collected in the field by less skilled personnel (as compared to an engineer) and returned to the field office for evaluation by the engineering team. Method 2 assumes a more skilled data collector or engineer is dispatched to the field to do real-time SCFT verification; for this method, the data collectors or system engineers must be adequately versed in the usage of the DM. In both methods, the data should be post-processed to guarantee that all SCFT requirements are met; therefore, the discussion emphasizes analysis by the engineering team. (Conversely, there may still be specific problems encountered in the field that require the engineers to perform drive test investigations themselves.)

In both methods, the basic flow of this activity involves:
- definition of drive test routes
- pre-departure equipment testing
- data collection
- origination/termination tests along metric routes
- hardware testing (if not completed by CFEs)
- continuous call or Markov data collection along metric routes
- data processing (mobile and CDLs)
- data analysis (mobile and CDLs)

Data collected includes DM data and CDLs. Each type of data plays a different role in the SCFT activity, as shown in Table 7.5-1.


Data Source: Mobile DM Markov or Continuous Call Data
Data Used To Verify: Antennas pointing in correct directions and antenna cabling is correct; PNs broadcast on correct antennas; adequate MobRx levels on each sector; adequate Ec/Io levels for each PN; soft and softer handoff capability (1)
Analysis Tool Used: Post-processing tool such as COMPAS, OPAS, or equivalent

Data Source: CDLs for call sampling (originations and terminations); ESNs used during drive test
Data Used To Verify: Mobile-to-Land and Land-to-Mobile calls for each sector; soft and softer handoff capability (1)
Analysis Tool Used: CDL Analysis Tool

Data Source: SMAP (optional)
Data Used To Verify: See Section 7.5.3.5 (Receive Diversity Testing)

Table 7.5-1: Tools Required For SCFT Data Analysis
(1) Usage of only one set of data and its accompanying tool is sufficient to validate soft/softer handoffs. Both techniques are discussed for reference.

7.5.1 Drive Route

7.5.1.1 Drive Route Definition

The drive route definition will depend upon which method is used to collect and analyze data. Both methods are discussed below.

7.5.1.1.1 Drive Route Definition - Method 1

Determine the number of cells in each cluster; the customer and Motorola should agree on this. A cluster normally consists of approximately 10 to 15 contiguous cell sites or one CBSC; this number may vary depending on the size of the system. Once a cluster has been defined, identify the first-tier sites around that cluster, which must undergo SCFT and be operational prior to any initial coverage testing for that cluster. Next, create a drive route that approximates a perpendicular X/Y crosshatch pattern around each site undergoing SCFT. The routes should extend approximately two-thirds of the distance from the site undergoing SCFT to the first tier of neighboring sites in all directions (this should enable soft handoff to these neighbor sites). The drive route for each site should also traverse all sectors of that site, passing through sector boundaries (this should allow the mobile to enter softer handoff). Where possible, emphasize major streets, roads, or highways to expedite the drive data collection. A sample crosshatch pattern consisting of perpendicular East/West and North/South streets is shown in Figure 7.5.1.1-1.

[Note: Careful development of SCFT drive routes can save time during the data collection. Engineers should brief data collection teams on the simple rules above in case obstacles are encountered along the drive route. If any highways or Interstates are to be driven, make sure the access points are at available entrance ramps. Also, take note of one-way streets and dead-end roads; it is important for the drive route to flow continuously. Be familiar with the area to be driven.]


The X in the center represents the site undergoing SCFT; Ys are neighboring sites.

Figure 7.5.1.1-1: Sample SCFT drive route map for Method 1

7.5.1.1.2 Drive Route Definition - Method 2

The drive route used when performing real-time SCFT is a circular route. The drive route begins in the first sector of the site undergoing SCFT and continues in a circular pattern to include all other sectors of the site. For example, in a 3-sector system, the drive tester would begin at a start/stop point identified in sector one (see point A in Figure 7.5.1.1-2). From point A, the drive team should proceed to point B in sector 2 and begin testing. When testing is complete in sector 2, the team should proceed to point C in sector 3 and begin the necessary testing; upon completion of sector 3 testing, the drive team should proceed back to point A. For softer handoff testing, the drive team should ensure that they are within 2-3 blocks of the site being tested, to ensure that the mobile is only using the tested site.


Figure 7.5.1.1-2: Sample SCFT drive route map for Method 2

7.5.1.1.3 Soft Handoff Drive Route Definition

If the SCFT site has no neighbors, or its neighbors have not been function tested, it is recommended to delay this test until all sites in the cluster are ready. For soft handoff testing, for both Method 1 and Method 2, the drive team should begin at the BTS (point 1) and drive away from the site, remaining inside the coverage area of that sector. When the PN for the adjacent site is the best active (point 2), the team should turn around and drive back to the starting point to ensure a proper handoff back into the site being tested. See Figure 7.5.1.1-3 for an illustration of the soft handoff drive route.

Figure 7.5.1.1-3: Sample Soft Handoff drive route map


7.5.2 Pre-Departure Equipment Test

Test all equipment before leaving the facility. Minimally, this should include verification of:
- 1 CDMA mobile phone (for call sampling)
- 1 CDMA mobile phone connected to:
- 1 laptop computer loaded with DM software, connected to:
- 1 GPS receiver (to log time and position data)

Figure 7.5.2-1 shows a sample drive test van equipment configuration block diagram; variations will exist for each supplier's equipment. All equipment should be made functional once connected.

[Figure 7.5.2-1 shows the van wiring: 12V DC power from the vehicle's cigarette lighter feeds an AC/DC inverter and the GPS unit; the GPS unit connects to the laptop/diagnostic monitor over a serial link, and the CDMA phone connects to the laptop's serial port, with its own power and antenna-port connections.]

Figure 7.5.2-1: Block Diagram of a Typical CDMA Drive Test Van Setup

Drive test equipment troubleshooting checklist:
- Are all components powered and turned on?
- Have all cables and connectors been tested?
- Is the cabling correct (correct GPS port, correct ports on the PC)?
- Does the PC meet the minimum system requirements for the DM software?
- Does the PC have all necessary drivers installed, with no resource conflicts?
- Is each component functional independently? (Does the DM work with no phone or GPS plugged in? Is the mobile functional?)
- Are all configuration settings correct, as specified by the component manufacturer (GPS settings like data transfer rate and parity)?
- Are all accessory devices attached and functioning (mobile antenna, attenuator)?

Note: If desired, a DM can be connected to the second (call sampling) phone. This is sometimes useful to confirm the performance of network phones, both of which should show the same performance in the same area.


7.5.3 Data Collection Activity

Two phones will be used for SCFT data collection. One phone will perform either Markov or continuous call data collection; this phone will be connected to the DM and will log RF performance data for the duration of the test. The second phone will perform call sampling to facilitate origination and termination testing. The procedures for each type of data collection are discussed below.

The drive team should go to the start of the drive route, then start making both types of calls while logging data and documenting the results. The route should be completed as efficiently as possible. Any problems, such as excessive call setup failures or dropped calls, should be called in to the engineer overseeing this data collection activity. (Contingencies must be addressed by the lead engineer as appropriate. A line of communication to qualified personnel at the CBSC is required to facilitate any troubleshooting activities.) An example of an SCFT data sheet is included in Appendix E; this should be used to track completion of SCFT on all sectors in a cluster.

7.5.3.1 Markov or Continuous Call Data Collection Procedure

The preferred type of call to characterize system performance around a cell site or in a cluster is a Markov call. Markov calls transmit prescribed distributions of full, half, quarter, and eighth-rate frames on both the forward and reverse links as long as the call is active. This allows the engineering team a fair calculation of frame erasure rates (FER), typically for full-rate frames only, on both links. Markov calls are pegged to specific, designated transcoder circuits at the CBSC. Markov calls are silent, i.e., no audio is heard on either link of the call.

Some precautions must be taken before using a Markov call. First, ensure that the phone being used is capable of Markov calls for the rate set being used by the system. Second, if the cell site is near an inter-CBSC border and ICBSC-SHO is enabled, data collection should be conducted using a regular (non-Markov) but still continuous call, since ICBSC-SHO and anchor handoffs are not supported for Markov calls. A continuous call can be set up to a dedicated circuit, where a radio station can be re-broadcast. (The drawback of having the operator just listen to the radio station to grade forward link audio quality is that there will not be any activity on the reverse link.)

The test is conducted by placing a call to (in order of preference) the Markov circuit phone number, a source of continuous audio (such as the re-broadcast radio station), or a land-line party (only one is required). The DM should be set up to record data prior to making each call to ensure that the origination sequence information is logged at the start of the call. Each test call should be kept active for a maximum of 30 minutes or until a drop (RF Loss) occurs, whichever comes first. (This precaution attempts to minimize any lost field data due to equipment, connection, or phone failures in the field.) The file is then saved. After each call is saved, DM recording should be restarted and the next call set up as quickly as possible to minimize data collection holes. Appendix 7A contains a sample Markov or continuous call log sheet; this form should be modified to meet the requirements of each market. All calls should have their start time and location documented. In addition, if a drop occurs, document the drop location and time. Refer to specific phone and DM user documentation to program the phone and/or DM for Markov operation.
7.5.3.2 Origination/Termination Test Procedure

Two methods of conducting origination/termination testing are discussed below.

Method 1: The RF engineer will use the CDLs to verify call originations and terminations on each sector of the new site. A DM is not required, but is useful to help investigate any problem(s) encountered with the Markov/continuous call; if available, it should be used. If equipment resources are thin, then just a phone will suffice. The DM operator should make three Mobile-to-Land calls and have the land operator then make one Land-to-Mobile call. Call duration should be 10 seconds, with a 10-second interval between calls. This quick-call approach should produce an adequate number of Call Detail Logs (CDLs) for this test ESN. This process should be repeated throughout the entire drive route. The tester will not have time to document all details for each call; however, in case of a call setup failure or call drop, the most important details to record are the failure time and location. See Appendix 7B for drive test log sheets.

Method 2: This method is primarily for the experienced engineer or an experienced DM operator who goes into the field and does verification in real time. Method 2 is not recommended for a 6-sector system because it is very difficult to identify the direction of sectors due to overlap. Start at point A and make an origination. Keep this call up while driving along the circular route explained in Section 7.5.1.1.2, and record the following data: the desired PN and actual PN, Ec/Io, Mobile Receive, and Mobile Transmit values. Also, enter the location where the measurements are taken. If the dominant PN does not match the expected PN, indicate this problem in the Comments section of the SCFT Form. If there is more than one dominant offset (Ec/Io values within 3 dB), indicate this in the same section. End the call. To verify terminations, have the switch call the mobile; this is defined as a mobile-terminated call. Repeat the origination/termination process for all sectors. The DM should be logging data to a file for this test as a precautionary measure. Post-processing and analysis of the collected data is not necessary unless a problem is identified. If problems are identified while testing according to Method 2, revert back to Method 1 for data processing and analysis.

7.5.3.3 Hardware Testing

The BTS hardware and cabling are typically tested by CFEs during the ATP (Acceptance Test Procedure) and OPTO (hardware optimization); see Chapter 3, Equipment Installation and Test. The hardware tested should include the BBX, CSM, and MCCs. Performing the origination test on each sector will complete BBX verification: if the correct PN appears and the tester can originate calls on that sector, then the BBX is functional. CSM functionality is verified by doing soft handoffs from the new BTS to its neighbors (see Section 7.5.3.4, Soft/Softer Handoff Verification). MCC functional testing is done by placing 15-second calls from any one sector of the new BTS. Verify that calls are placed on all channel elements for all MCC cards. Note that there are two overhead channel elements (paging and access) per sector that will not carry voice traffic. The specific channel elements being accessed can be monitored from the CBSC using call-proc or by tailing the test phone. If a failure occurs on a specific MCC card and channel element, the SNAP command should be used to see if the failure re-occurs on the same channel element. If a channel element is found to repeatedly cause failures, that element should be disabled (OOS_MAN) and the card should be replaced.

Table 7.5.3.3-1 shows an MCC checklist for a three-sector system with 4 MCC-16 MAWI cards, where the cells for the paging and registration channels are blacked out. In this instance, 58 calls must be placed to access all channel elements (4 cards x 16 channel elements = 64, minus 3 sectors x 2 overhead channel elements = 58).

[Checklist grid: rows Mawi 1 through Mawi 4; columns CE 1 through CE 16; the six overhead channel elements are blacked out.]

Table 7.5.3.3-1: Channel Verification for 3-Sector MCC-16
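The call count above follows directly from the card and overhead arithmetic; a minimal sketch (parameter names are illustrative):

# Minimal sketch of the channel-element arithmetic behind Table 7.5.3.3-1.
# Assumes MCC-16 cards (16 CEs each) and two overhead (paging/access)
# channel elements per sector that carry no voice traffic.

def calls_needed(mcc_cards, ces_per_card=16, sectors=3, overhead_per_sector=2):
    return mcc_cards * ces_per_card - sectors * overhead_per_sector

print(calls_needed(4))  # 4 x 16 - 3 x 2 = 58 calls, as in Table 7.5.3.3-1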

7.5.3.4 Soft/Softer Handoff Verification

Softer handoffs can be verified between the origination/termination tests for adjacent sectors. After completing the termination test for a sector, originate a call, keep the call up, and proceed to the start/stop point for the adjacent sector, verifying that a handoff occurs from the starting sector to the next sector. If the handoff is verified, enter "pass" on the SCFT Form; if not, enter "fail". Upon arriving at the destination sector, end the call. Do the origination/termination testing for the second sector, then repeat the above softer handoff process for all sectors. To test soft handoffs, see the soft handoff drive route definition in Section 7.5.1.1.3. If the handoff is verified, enter "pass" on the SCFT Form; if it is not, enter "fail" and continue with the process. The SCFT form referenced in this section has been included in Appendix 7D; please modify the forms according to the number of sectors in each site.

7.5.3.5 Receive Diversity Testing (optional)

This test will help verify proper cabling and antenna configuration for a new BTS. The tools required are:
- Portable transmitter (phone capable of transmitting at 23 dBm)
- SMAP

The test should be performed in each sector. From the start point of one sector (see point A in Figure 7.5.1.1-2), the portable transmitter should be keyed up. With SMAP, record each sector's main and diversity RSSI levels. Verify that the two strongest receive levels are from the receive paths associated with the specific sector being tested. If the strongest receive paths are not from the test sector, there may be a problem with the antenna configuration or cabling.
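A minimal sketch of the pass/fail check for this test follows; the RSSI values and data layout are illustrative only (actual readings come from the SMAP display described in Section 7.5.3.5.1).

# Minimal sketch: confirm that the two strongest receive paths among the
# recorded main/diversity RSSI readings belong to the sector under test.

def diversity_check(rssi, test_sector):
    """rssi maps (sector, path) -> RSSI in dBm, e.g. (1, "main"): -68.0"""
    strongest = sorted(rssi, key=rssi.get, reverse=True)[:2]
    return all(sector == test_sector for sector, _path in strongest)

readings = {
    (1, "main"): -68.0, (1, "div"): -71.0,   # sector under test
    (2, "main"): -95.0, (2, "div"): -96.5,
    (3, "main"): -93.0, (3, "div"): -97.0,
}
print(diversity_check(readings, test_sector=1))  # True: cabling looks correct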


7.5.3.5.1 SMAP Procedure for Receive Diversity Testing

SMAP provides both main and diversity RSSI data for all three/six sectors. This helps to ensure that the spectrum is clear of any unwanted noise on the CDMA band before putting any commercial traffic on the system. It also helps to determine whether the main and diversity antennas are connected properly. For example, cables are often inadvertently swapped during installation, and sometimes the two diversity antennas are swapped. Although system engineers can detect that main Rx antennas are swapped by looking at a PN value, they cannot confirm whether diversity antennas are swapped without using SMAP, because no other tool shows this data.

The user needs to turn on RSSI filters for the desired sectors. This can be done by placing a BTS and sector number in the appropriate fields and by turning on the appropriate Sector Data Log (SDL) message; use the SDL2 filter for single-carrier systems. Select BTS Status Display under the monitoring display menu and enter the correct CBSC, BTS, and sector values. This screen shows a graph where RSSI values are plotted; they are also displayed in numeric format on the right. The goal is to identify the sector the mobile is in using RSSI values: look for RSSI values that are higher on both the main and diversity antennas for that sector. The received value depends on how strongly the mobile is transmitting and how far it is from the antennas, so it is important to transmit a strong signal to distinguish on SMAP the sector you are transmitting from. The best possible solution is to transmit at 23 dBm on the reverse link using a modified Qualcomm phone; for example, the QCP-1900 can be modified by cutting a particular resistor on the receiver circuit. This gives a delta of about 10-20 dB and identifies the sector you are in very clearly. If this is not possible, try to be within a fraction of a mile of the site, which should give about a 2-3 dB delta between the sector you are on and the other sectors.

7.5.4 Mobile and CDL Data Processing

There are many tools available to process mobile data, and many different ways to use each of those tools. This procedure uses Motorola tools as examples to show how to process mobile data. CDL data processing is very specific: only the CDL Analysis Tool is used to process CDLs.

7.5.4.1 Mobile Data Processing

Once the mobile data is collected and transferred to the computer system and directory designated for data processing, the data can be processed using COMPAS. To learn more about COMPAS, see the TED (Technical Education & Documentation) web page; the document number is COMPAS3.2 68P09248A05-A.

The two required outputs from the COMPAS-processed data for SCFT analysis are mobile messaging (finger lock information) and time/location data.

Mobile Messaging Data

A sample of a comma-delimited mobile message containing information indicating finger locks in a COMPAS-produced MOBILE (.mob) file is shown here:

Time, Date, State, IS-95 message, Dir, AckSeq, MsgSeq, AckReq, Decoded Contents
13:58:28:253, OCT 20 1998, HANDOFF, PSMM, <--, 4, 3, 1, ENCRYPTION=0, REF_PN=231, PILOT_STRENGTH=-2.500000, KEEP=1, PILOT_PN_PHASE=228, PILOT_STRENGTH=-31.500000, KEEP=0

This sample shows that the mobile is locked on to reference PN 231 and is also requesting to drop pilot PN 228.

Time/Location Data

Time and location data are contained in the COMPAS-produced, comma-delimited TIME (.tim) file. A sample header for a TIME file is shown in Appendix 7C; the header shows the order of the collected and post-processed data for each entry in the TIME file. Definitions of each of these data fields can be found in the COMPAS product documentation.

After processing the drive test data through COMPAS, the *.mob and *.tim files will be found in the following directory structure of the analysis where COMPAS has been run:

/analysis/COMPAS/TIME/*.tim
/analysis/COMPAS/DECODE/*.mob

7.5.4.2 CDL Data Processing

CDL data should be transferred from the OMC-R to an off-line analysis workstation. (Avoid running the CDL Analysis Tool on the OMC-R.) The tools used to browse the CDLs are discussed in Chapter 6. The CDL file name will be cdl.<date & time>; it is best to use the tar utility to keep all the CDLs for a specific date together, and the tar file should be compressed before being transferred. Once the binary CDLs have been transferred and processed, the output can be grep'ed to isolate the CDLs created by the origination/termination test mobile. Here is an example of how to use the UNIX grep command:

Cdl_browse cdl.* | pgrep ESN=0xabf0c83 >> Test_mobile.cdl

Cdl_browse cdl.* -- reads through all of the cdl.<date> files in the current directory and changes them from binary to ASCII format.
| pgrep ESN=0xabf0c83 -- pulls out the entire CDL record where ESN=0xabf0c83 is found.
>> Test_mobile.cdl -- writes the grep'ed data to the file Test_mobile.cdl. Two > symbols mean append to the file.

This Test_mobile.cdl file will be used in the next section to verify originations and terminations.


7.6 Data Analysis Procedures

7.6.1 Origination/Termination Verification Methods

Three methods are presented to verify originations and terminations on each sector, using either CDLs or mobile (messaging) data. The primary approach is to use the bts_orig.dst file created by the CDL Analysis Tool (CAT). The secondary approach is to process the browsed CDLs through a script that extracts a list of all BTS/sectors upon which originations and terminations were made. Both the primary and secondary approaches take advantage of not having to use a DM connected to the call sampling phone. The tertiary approach requires use of the DM to collect mobile message files; this data is processed and examined to extract the PN upon which each call was set up, and those PNs are mapped to the various BTS/sectors. Each of these methods is presented below. (Finally, as an alternative, the drive test technician could write down all the PNs originated on as he/she drove around a cell site. This would require a fairly skilled technician and is not as robust as any of the approaches laid out below.)

7.6.1.1 Origination/Termination Verification Using bts_orig.dst

One of the simplest ways to verify originations and terminations on each sector of the new site is to process the browsed CDLs with the CDL Analysis Tool (CAT) and view the bts_orig.dst file, one of the most useful reports produced by the CAT. A sample of this output report, for BTS 51 (a 6-sector site), is shown here:

BTS/SEC Orig/Term Rate
BTS  SEC  Orig  Term  Orig%  Term%  O_C%   T_C%   CCR
51   1    2     1     66.67  33.33  11.11  5.556  100
51   2    1     1     50     50     5.556  5.556  100
51   3    3     2     60     40     16.67  11.11  100
51   4    2     1     66.67  33.33  11.11  5.556  100
51   5    0     1     0      100    0      5.556  100
51   6    2     2     50     50     11.11  11.11  100

The headings of this report/file and all other reports/files are explained in the CDL Analyzer documentation (http://www.cig.mot.com/~wheelrts/analyzer.html). This report shows that all sectors had successful terminations. Also, all sectors except 51-5 experienced successful originations; in this case, a follow-up drive test should be scheduled to verify that this sector can perform successful originations.

7.6.1.2 Origination/Termination Verification Using Browsed CDLs

Looking through the ASCII CDLs is another means to verify that each sector had successful originations and terminations. The best method to determine whether a call is an origination or a termination is to check the ENTRY_TYPE field in the CDL for that call. Table 7.6.1.2-1 lists the various entry types.

ENTRY_TYPE   Type of Call
0            Origination
1            Termination
2            CDMA to CDMA hard handoff
3            CDMA to CDMA soft handoff

Table 7.6.1.2-1: Entry Type Definitions for CDLs

By looking at the ACCESS_BTS and ACCESS_SECTOR fields, the CDL below shows there was a successful termination on BTS-51, Sector-5 for ESN=0x0abf0c83. (Also, for terminations, the DIALED_DIGITS= field is null.) Perusing the browsed CDLs, or writing a short script to extract the ACCESS_BTS and ACCESS_SECTOR values for the test ESNs, will give you a list of all sectors that supported successful originations and/or terminations. Handoffs can also be verified using the CDL by comparing the ACCESS_BTS field with the LAST_MAHO_ACT or LAST_SHO_BTS fields; the LAST_SHO_* fields state the last CBSC, BTS, and sector the mobile was in soft handoff with.
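As a minimal illustration of such a script, the sketch below tallies the access BTS/sector for a test mobile's originations and terminations from a browsed CDL file; the field names follow the browsed-CDL format shown in Figure 7.6.1.2-1, and the file name and ESN are the examples used in this section.

# Minimal sketch: list the sectors on which a test mobile originated and
# terminated, from a browsed (ASCII) CDL file such as Test_mobile.cdl.
import re
from collections import Counter

ENTRY_NAMES = {"0": "Origination", "1": "Termination"}

def tally(record, counts, test_esn):
    if (record.get("ESN") == test_esn
            and record.get("ENTRY_TYPE") in ENTRY_NAMES
            and "ACCESS_BTS" in record and "ACCESS_SECTOR" in record):
        key = (record["ACCESS_BTS"], record["ACCESS_SECTOR"],
               ENTRY_NAMES[record["ENTRY_TYPE"]])
        counts[key] += 1

def access_summary(path, test_esn="0x0abf0c83"):
    counts, record = Counter(), {}
    with open(path) as f:
        for line in f:
            # "CDL:" opens a new record in browsed output; flush the old one.
            if line.startswith("CDL:"):
                tally(record, counts, test_esn)
                record = {}
            for key, value in re.findall(r"(\w+)=(\S+)", line):
                record[key] = value
    tally(record, counts, test_esn)  # flush the final record
    return counts

for (bts, sector, kind), n in sorted(access_summary("Test_mobile.cdl").items()):
    print(f"BTS {bts} sector {sector}: {n} {kind}(s)")

Figure 7.6.1.2-1: Example of a Browsed CDLLOG (Start)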
BROWSE CDLLOG MSI-4406239975 99-01-13 13:08:33 omc MM-6 L000000.00000 112209/164884

CDL:1
CDL_SEQ_NUM=0x7d3d

CALL_REF_NUM=0x02b3 CBSC=6 MMADDR=0xa6 XC=1 CPP=4 MID=4406239975 ESN=0x0abf0c83 SCM=0x62006200 MOBILE_PROTOCOL_REV=3 DIALED_DIGITS= ACCESS_TIME=13:08:28 ACCESS_PN_OFFSET=21 ACCESS_STR=0x0ff7 ACCESS_CHANNEL=1120 ACCESS_BTS=51 ACCESS_SECTOR=5 ENTRY_TYPE=1 SERVICE_OPTION=0x0003 NEGOTIATED_SO=0x0003 LAST_MM_SETUP_EVENT=23 CIC_SPAN=68 CIC_SLOT=30 XCDR=0x0110 INIT_RF_CONN_BTS=51 INIT_RF_CONN_SECTOR=5 INIT_RF_CONN_MCC=22 INIT_RF_CONN_ELEMENT=19 INIT_RF_CONN_CHANNEL=1120 CFC=1

LAST_RF_CONN2_SECTOR=5 LAST_RF_CONN2_SSECTOR=6 LAST_RF_CONN2_MCC=22 LAST_RF_CONN2_ELEMENT=19 MCC_RELEASE2_TIME=0x8b3b LAST_RF_HIGA2_INTERVALS=0 LAST_RF_HIGA2_BEGIN=0x0000 LAST_RF_HIGA2_END=0x0000 LAST_RF_HIGA2_COUNT=0 LAST_RF_HIGA2_TEMP=0x0000 LAST_RF_SETP2_INTERVALS=0 LAST_RF_SETP2_BEGIN=0x0000 LAST_RF_SETP2_END=0x0000 LAST_RF_SETP2_COUNT=0 LAST_RF_SETP2_TEMP=0x0000 LAST_RF_CONN1_MMADDR=0xa6 LAST_RF_CONN1_BTS=51 LAST_RF_CONN1_SECTOR=4 LAST_RF_CONN1_SSECTOR=0 LAST_RF_CONN1_MCC=22 LAST_RF_CONN1_ELEMENT=7 MCC_RELEASE1_TIME=0x8b3d LAST_RF_HIGA1_INTERVALS=0 LAST_RF_HIGA1_BEGIN=0x0000 LAST_RF_HIGA1_END=0x0000 LAST_RF_HIGA1_COUNT=0 LAST_RF_HIGA1_TEMP=0x0000 LAST_RF_SETP1_INTERVALS=0 LAST_RF_SETP1_BEGIN=0x0000 LAST_RF_SETP1_END=0x0000

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) RELEASE_TIME=13:08:51 XC_RELEASE_TIME=0x8b31 INIT_MM_REL_EVENT=3 ONE_PILOT_COUNT=0 TWO_PILOTS_COUNT=1 THREE_PILOTS_COUNT=1 LOC_S_ADD_COUNT=1 LOC_SR_ADD_COUNT=1 LOC_S_DROP_COUNT=0 LOC_SR_DROP_COUNT=0 EXT_S_ADD_COUNT=0 EXT_SR_ADD_COUNT=0 EXT_S_DROP_COUNT=0 EXT_SR_DROP_COUNT=0 BETTER_ACTIVE=0 LOC_S_PILOTS_REL=1 LOC_SR_PILOTS_REL=2 EXT_S_PILOTS_REL=0 EXT_SR_PILOTS_REL=0 LAST_HO_BLOCK_CAUSE=13 LAST_HO_BLOCK_TIME=13:08:39 LAST_HO_BLOCK_PN=370 ICS_BEGIN_TGT_MMADDR=0x00 ICS_BEGIN_TGT_BTS=0 ICS_BEGIN_TGT_SECTOR=0 ICS_BEGIN_SRC1_BTS=0 ICS_BEGIN_SRC1_SECTOR=0 ICS_BEGIN_SRC2_BTS=0 ICS_BEGIN_SRC2_SECTOR=0 ICS_BEGIN_TIME=0:00:00 ICS_END_TGT_MMADDR=0x00 ICS_END_TGT_BTS=0 ICS_END_TGT_SECTOR=0 ICS_END_SRC1_BTS=0 ICS_END_SRC1_SECTOR=0 ICS_END_SRC2_BTS=0 ICS_END_SRC2_SECTOR=0 ICS_END_TIME=0:00:00 ICS_COUNT=0 ICS_CBSCS=0 LAST_RF_CONN3_MMADDR=0x00 LAST_RF_CONN3_BTS=0 LAST_RF_CONN3_SECTOR=0 LAST_RF_CONN3_SSECTOR=0 LAST_RF_CONN3_MCC=0 LAST_RF_CONN3_ELEMENT=0 MCC_RELEASE3_TIME=0x0000 LAST_RF_HIGA3_INTERVALS=0 LAST_RF_SETP1_COUNT=255 LAST_RF_SETP1_TEMP=0x87e8 INIT_MAHO_TIME=0x870f INIT_MAHO_CAUSE=8 INIT_MAHO_ACT_STR=0x14 INIT_MAHO_CAND1_MMADDR=0xa6 INIT_MAHO_CAND1_BTS=51 INIT_MAHO_CAND1_SECTOR=6 INIT_MAHO_CAND1_STR=0x14 INIT_MAHO_CAND2_MMADDR=0x00 INIT_MAHO_CAND2_BTS=0 INIT_MAHO_CAND2_SECTOR=0 INIT_MAHO_CAND2_STR=0x00 INIT_MAHO_CAND3_MMADDR=0x00 INIT_MAHO_CAND3_BTS=0 INIT_MAHO_CAND3_SECTOR=0 INIT_MAHO_CAND3_STR=0x00 LAST_MAHO_TIME=0x88f9 LAST_MAHO_CAUSE=8 LAST_MAHO_ACT1_MMADDR=0xa6 LAST_MAHO_ACT1_BTS=51 LAST_MAHO_ACT1_SECTOR=5 LAST_MAHO_ACT1_STR=0x18 LAST_MAHO_ACT2_MMADDR=0xa6 LAST_MAHO_ACT2_BTS=51 LAST_MAHO_ACT2_SECTOR=4 LAST_MAHO_ACT2_STR=0x1f LAST_MAHO_ACT3_MMADDR=0xa6 LAST_MAHO_ACT3_BTS=51 LAST_MAHO_ACT3_SECTOR=6 LAST_MAHO_ACT3_STR=0x0a LAST_MAHO_CAND1_MMADDR=0xa6 LAST_MAHO_CAND1_BTS=51 LAST_MAHO_CAND1_SECTOR=3 LAST_MAHO_CAND1_STR=0x19 LAST_MAHO_CAND2_MMADDR=0x00 LAST_MAHO_CAND2_BTS=0 LAST_MAHO_CAND2_SECTOR=0 LAST_MAHO_CAND2_STR=0x00 LAST_MAHO_CAND3_MMADDR=0x00 LAST_MAHO_CAND3_BTS=0 LAST_MAHO_CAND3_SECTOR=0 LAST_MAHO_CAND3_STR=0x00 LAST_SHO_TIME=13:08:30 LAST_SHO_CAUSE=8 LAST_SHO_RESULT=1 LAST_SHO_MMADDR=0xa6 LAST_SHO_BTS=51 113

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) LAST_RF_HIGA3_BEGIN=0x0000 LAST_RF_HIGA3_END=0x0000 LAST_RF_HIGA3_COUNT=0 LAST_RF_HIGA3_TEMP=0x0000 LAST_RF_SETP3_INTERVALS=0 LAST_RF_SETP3_BEGIN=0x0000 LAST_RF_SETP3_END=0x0000 LAST_RF_SETP3_COUNT=0 LAST_RF_SETP3_TEMP=0x0000 LAST_RF_CONN2_MMADDR=0xa6 LAST_RF_CONN2_BTS=51 LAST_SHO_SECTOR=4 LAST_SHO_MCC=22 LAST_SHO_ELEMENT=7 SETUP_EVENTS=0x080c7efc FWD_QUALITY=0 LAST_FWD_INCR=0x0000 MEAS_COUNT=0 RVS_QUALITY=0 LAST_RVS_INCR=0x8080 RVS_ERASE_COUNT=5 RF_FADE_COUNT=0

End of Figure 7.6.1.2-1: Example of Browsed CDLLOG 7.6.1.3 Origination/Termination Verification Using Mobile Data Since the data collection procedure used did not call for (unless available) a DM to be connected to the call sampling phone, this is not a primary approach, but is included as an alternative. The COMPAS-produced <esn>.mob file contains the IS-95 messaging. The COMPAS-produced <esn>.tim file contains location and signal strength information (see Section 7.5.4.1). These COMPAS messages are abbreviated using acronyms. (A list of these acronyms and their meanings can be found in Appendix 7D.) Look through the mobile messages to look for an Origination Attempt message. In order to determine which PN the mobile originated on, look for the first Power Strength Measurement Message (PSMM) after the call is set up. Note this PN, then also look in the Paging Channel System Parameter Message (PCSPM) prior to the origination attempt. The Pilot PN is given. The PN found in both of these is most likely the same and is the PN used for origination. In the event of a discrepancy, the actual PN used is the one noted after the call setup. (In cases where the PNs are different in these two messages, this typically, indicates an area of many pilots or rapidly changing pilots. This should not happen too often unless the drive test vehicle is traveling at high speeds or operating in a region where many pilots are present.) Example of First PSMM Showing Reference PN on Which Call Was Set Up:
11:57:33:161, SEP 29 1998, CALL_UP, PSMM, <--, 0, 0, 1, ENCRYPTION=0, REF_PN=362, PILOT_STRENGTH=-7.500000, KEEP=1, PILOT_PN_PHASE=382, PILOT_STRENGTH=11.000000, KEEP=1, PILOT_PN_PHASE=366, PILOT_STRENGTH=-14.000000, KEEP=1, PILOT_PN_PHASE=372, PILOT_STRENGTH=-10.500000, KEEP=1

Accompanying Example of Paging Channel System Parameter Message (PCSPM):


11:57:28:546, SEP 29 1998, CALL_DOWN, PCSPM, P, -, -, -, PILOT_PN=362, CONFIG_MSG_SEQ=1, SID=12336, NID=1, REG_ZONE=21, TOTAL_ZONES=1, ZONE_TIMER=0, MULT_SIDS=0, MULT_NIDS=0, BASE_ID=3537, BASE_CLASS=0, PAGE_CHAN=1, MAX_SLOT_CYCLE_INDEX=2, HOME_REG=1, FOR_SID_REG=1, FOR_NID_REG=1, POWER_UP_REG=0, POWER_DOWN_REG=0, PARAMETER_REG=0, REG_PRD=0, BASE_LAT=0x07A41C, BASE_LONG=0x1DBFB4, REG_DIST=0, SRCH_WIN_A=6, SRCH_WIN_N=8, SRCH_WIN_R=9, NGHBR_MAX_AGE=0, PWR_REP_THRESH=2, POWER_REP_FRAMES=9, PWR_THRESH_ENABLE=1,

114

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT)
PWR_PERIOD_ENABLE=0, PWR_REP_DELAY=1, RESCAN=0, T_ADD=28, T_DROP=32, T_COMP=8, T_TDROP=3, EXT_SYS_PARAMETER=1, EXT_NGHBR_LIST=1, GLOBAL_REDIRECT=0

If the call was a termination attempt, this can also be confirmed by searching for a Paging Channel General Page Message (PCGPM) which will be followed by the first PSMM as described above. Call processing sequences, including call set ups for both originations and terminations are described in the Motorola Confidential and Proprietary document, Call Processing SFS. Excerpts of this document are described at: http://www.cig.mot.com/~klnknbrg/r5cfcdocument.html. 7.6.2 Soft/softer Handoff Verification Methods Two methods are discussed to verify soft and softer handoff instances for each site/sector. These methods use either mobile messaging data or CDLs as discussed below. 7.6.2.1 Soft/Softer Handoff Verification Using Mobile Messaging The best approach, getting the most bang for the buck to identify occurrences of soft and softer handoff would be usage of mobile messaging data. Throughout the duration of a Markov or continuous call, there are likely a large number of combinations of soft and softer handoffs that the mobile encounters. To identify which PNs the mobile used for soft/softer handoff for the entire duration of the call, the mobile messaging would need to be examined. Soft/softer handoffs can be tallied by looking at all the Handoff Completion Messages (HCM). These HCM messages will show the various combinations of PNs that were in soft/softer handoff together during the drive. A sample HCM is shown below. This HCM indicates that the mobile has entered soft handoff with PNs 362 and 372. 11:57:34:119, SEP 29 1998, HANDOFF, HCM, <--, 2, 3, 1, ENCRYPTION=0, LAST_HDM_SEQ=0, PILOT_PN=362, PILOT_PN=372 A simple script would be required to parse through the mobile messaging of all files collected along a SCFT drive route to identify all combinations of soft and softer handoffs. Then a simple checking routine could identify if all sectors entered into softer handoff with same site sectors and into soft handoff with neighboring sites/sectors. 7.6.2.2 Soft/Softer Handoff Verification Using CDLs CDLs can also be used to verify soft and softer handoffs. A set of fields called LAST_RF_CONNX_BTS and SECTOR (X=[1:3]) can be used to identify which sites/sectors the mobile is in soft or softer handoff with when the call ends. Alternatively, the INIT_MAHO_CANDX fields also indicate BTS/Sectors that the mobile will have been in soft(er) handoff with immediately after call setup. The only drawback from this approach is that this limits the number of reports of soft/softer handoff partners, since these CDL fields are only pegged at the end of the call. An example of this can be seen in the Browsed CDL shown previously in Figure 7.6.1.2-1. Softer handoffs between sectors 4, 5, and 6 are also verified by looking at the various RF_CONN fields. If one were to use this method, the call sampling ESNs should be used since they would generate many more CDLs than the Markov or continuous call ESNs. 115

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) 7.6.2.3 Soft/Softer Handoff Troubleshooting If a particular type of handoff is not happening or can not be located in the mobile data or CDLs, a special troubleshooting drive may need to be scheduled. Typical problems seen with handoff failures are incomplete Neighbor Lists or hardware issues such as GLI/BBX failure (puts a site/sector off the air), CSM failure (site/sector loses CDMA timing reference so mobile can not use this site/sector to enter into soft/ softer handoff with). The engineering team should investigate these problems and re-schedule a new drive test in the areas around a problem site/sector. The engineer should see real-time variations of combinations of PNs in the active set. If this real time troubleshooting exercise does not indicate that the problem has been solved, then the integrity of data collection tools should be checked. 7.6.3 RF Performance Verification for Each Sector For each sector evaluated during SCFT, the following RF performance items should be checked: Verify propagation pattern for each sector (Is correct PN being broadcast on correct antenna in the correct direction?) Is the sector transmitting an acceptable signal as evidenced by adequate Mob Rx levels at the mobile? Is the link balanced, as evidenced by the mobile transmit value conforming to the IS-95 link balance equation (Mob Tx = -(Mob Rx) - 73 + TxAdj)? Is there adequate signal quality at the mobile, as evidenced by reasonable Ec/Io levels?

Propagation Pattern and Ec/Io Verification The RF propagation pattern of a specific sector, also known as a PNs footprint, is a useful tool to determine if a particular sites RF plumbing and PN database are correctly set up. Many post-processing tools have the capability of graphically displaying a single PN offset. The method of displaying specific PNs will vary depending upon the tool. A Motorola tool capable of viewing a PN plot within the COMPAS tool can be found at: http://www.cig.nml.mot.com/cdma/kctopt/tools/ . Select PN Plot from the Tool Box. If all sites surrounding the new BTS are on the air during the SCFT, the following method can be used to verify antenna orientation and relative pilot strength. If the site has no immediate neighbors or its neighbors are not up yet, then simply assign one color for each PN. This would show where all the sectors propagate within one plot, but would not show each PNs strength. The following two figures (7.6.3-1 and 7.6.3-2) are examples of PN Plots generated with the PN Plot tool. The Ec/Io of the PN is displayed on a scale from black to light gray. Use of the PN Plot tool automatically gives the user an indication of the Ec/Io levels for each sector. A black & means the Ec/Io was greater than or equal to Tadd dB, a light gray & means the Ec/Io of that PN was less than Tadd dB. Notice how the PN footprint for values above Tadd for sector 6 looks evenly spread, albeit somewhat shifted counterclockwise. This slight shifting must be evaluated in terms of the roads driven. For example: Did the drive route enable uniform coverage around each sector, or did it skew the results one direction or another? Also, any obstacles that may reflect the energy from an antenna to a different area must be considered: Is there a large building across 116

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) the street from the antenna that acts as a reflector? One can see that energy from sector 6 shows up all around the site in all sectors, although a good portion of it is below Tadd. This is typical of a dense urban environment from which this sample was taken. More suburban and rural areas may have cleaner propagation patterns, but precautions should be taken in rough terrain (hills, valleys) areas. (Note that this 6-sector site also employs 60 -degree azimuth beam width antennas.) In contrast, Figure 7.6.3-2 shows an RF pattern for values above Tadd for sector one which is highly irregular. The PN for sector one is showing up as a strong influence in sectors 2, 5, 6, and along the border between sectors 3 and 4. However, the largest single occurrence of energy from sector 1 is in the bounds of sector 1. Does this positively guarantee that sector 1 antenna is pointing in the correct direction with the correct PN? Since this is again a dense urban environment, the answer is probably yes. However, the optimization engineer should continue to monitor this situation during subsequent optimization activities including the initial coverage test. Figure 7.6.3-1: PN Plot for Site 106, Sector 6:

117

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) Figure 7.6.3-2: PN Plot for Site 106, Sector 1

7.6.4

Mobile Receive and Transmit Level Verification

The next step of SCFT analysis is to determine if the mobile receive and transmit powers are within acceptable ranges for the new site/sector, and comply with the IS-95 link balance equation. Again, use the post-processing tool to view plots of each parameter. The diagram below is a general table of acceptable ranges for these RF parameters within the coverage footprint of a cell site/sector. Generally speaking if the levels fall outside of these bounds, that the mobile is outside of an adequate coverage area. Parameter RSSI Mobile Transmit Acceptable Level -40 to 90 dBm < +17 dBm

Of course, these values will vary depending on the terrain, amount of blockage between the sector and the mobile, mobile antenna length, antenna position, or many other variances within the drive test vehicle.

118

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) 7.6.5 Troubleshooting

In troubleshooting problems with new sites, make sure to check the obvious solutions first: Does this site/sector undergoing SCFT meet all the Entrance Criteria requirements? Was the site turned on and functional from the CBSC perspective? Were there problems with a specific mobile? Is the database correct for the new site? Does the site have the correct neighbor list? Is the PN offset correct? The Event Logs, which are logged on the OMCR, should be checked at the time of the test. Event Logs will sometimes show problems, such as alarms, that can affect a specific site or sector.

7.7

Exit Criteria

The single cell functional test data sheet is complete. Origination and termination tests pass successfully. Soft and softer handoffs occur as appropriate. The PN offsets and RF propagation patterns have been verified for each sector. Mobile receive and transmit levels are acceptable. Problem resolution matrix has been updated with any unusual findings.

119

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT)

APPENDIX 7A: Sample Markov or Continuous Call Data Collection Log Sheet
Date: Direction: Call # 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 Start Time MOBID: Area: Start Location ESN: Phone Manufacturer & Version: End Time End Location File Name Operator:

120

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT)

APPENDIX 7B: Sample Call Sampling Data Log Sheet


Date: Zone: Ca ll Ty pe Jan 24, 2000 MOBID: Direction &: Area DM Time Call Proc Time Ref #
th

812-555-4365
New Site 389
Location

ESN:

03FF0C93

Operator

D8
# of Ca Ty ll # pe 1/ 1 2/ 2 3/ 3

Phone Manufacturer & Version: V No D- Op O Se An A Er X rvi FTO FTT RF alo H/ r Qu ce g O al.

Configuration (h

Active(s)

File Name

4 & Roswell St.


M-L M-L M-L 14:23:00 14 : 25:00 14: 27: 00 : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : :

1 1 1

214, 218, 320 222, 324, 418 332, 418

0124001423.qlc
.

4 & Fleming Ave. 4 & Becker Ln.


th

th

0124001425.qlc . 0124001427.qlc
. . . . . . . . . .

L-M 1/ 4 M-L M-L M-L 4/ 5 5/ 6 6/ 7

L-M 2/ 8 M-L M-L M-L 7/ 9 8/ 10 9/ 11

L-M 3/ 12

121

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT)

APPENDIX 7C: SINGLE CELL FUNCTIONAL TEST TRACKING SHEET


Cluster Name: EMXC/MM- 3 BTS and Sector List BTS-1-1 BTS-1-2 BTS-1-3 BTS-1-4 BTS-1-5 BTS-1-6 BTS-86-1 Etc . . First Tier Sites BTS-93-1 BTS-105-6 Etc. . . Cluster Location: Downtown Originations (Pass/Fail) Dates Tested: Number 1/3/99 of BTS: 1/15/99 16 Terminations (P/F) Soft HO (P/F) Softer HO (P/F) Antenna Direction/ PN Check (P/F) Mob Rx

RF Check

Mob Tx

EcIo

Comments

122

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT)

APPENDIX 7D: .TIM File Header (Description of .tim File Data Contents)
Relative Offset, Time, Date, Latitude, Longitude, Average Speed, Forward FER Average, Forward FER Sum, Forward Burst Error Max, Forward FER Samples, Reverse FER Average, Reverse FER Sum, Reverse Burst Error Max, Reverse FER Samples, Transmit Power Average, Receive Power Average, Spectrum Analyzer Energy, Eb/No, Traffic Channel Gain, Power Control Gain, Voice Gain, Combined Finger Ec/Io, Number of Active Pilots, Aggregate Active Ec/Io, Best Active PN, Best Active PN Ec/Io, Pilot Dominance, Aggregate Interfering Ec/Io, Worst Interfering PN, Worst Interfering PN Ec/Io, Active vs. Other Pilot Ec/Io Delta, State Of Call CBSC, Call Status, Call Reason, Layer 2 Message Type, Number of Times 0 Locked Fingers, Number of Times 1 Locked Fingers, Number of Times 2 Locked Fingers, Number of Times 3 Locked Fingers, Cell ID 1, Cell ID 1 Forward Power, Cell ID 1 RSSI for Main, Cell ID 1 RSSI for Diversity, Cell ID 2, Cell ID 2 Forward Power, Cell ID 2 RSSI for Main, Cell ID 2 RSSI for Diversity, Cell ID 3, Cell ID 3 Forward Power, Cell ID 3 RSSI for Main, Cell ID 3 RSSI for Diversity, BSC PN offset 1, BSC PN offset 2, BSC PN offset 3, Mobile PN offset 1, Mobile PN offset 2, Mobile PN offset 3, Finger PN offset 1, Finger Energy 1, Finger PN offset 2, Finger Energy 2, Finger PN offset 3, Finger Energy 3, Markov 9600 Exp Full Rx Full Bit Error, Markov 9600 Exp Full Rx B&B1, Markov 9600 Exp Full Rx Half, Markov 9600 Expected Full Rx Quarter, Markov 9600 Expected Full Rx Eighth, Markov 9600 Exp Full Rx Prob Bit Error, Markov 9600 Exp Full Rx Erasure, Markov 9600 Exp Full Rx B&B2, Markov 14400 Exp Full Rx Full Bit Error, Markov 14400 Exp Full Rx Full B&B1, Markov 14400 Exp Full Rx Full B&B2, Markov 14400 Exp Full Rx Half, Markov 14400 Exp Full Rx Half B&B1, Markov 14400 Exp Full Rx Half B&B2, Markov 14400 Expected Full Rx Quarter, Markov 14400 Exp Full Rx Quarter B&B1, Markov 14400 Exp Full Rx Quarter B&B2, Markov 14400 Expected Full Rx Eighth, Markov 14400 Exp Full Rx Eighth B&B, Markov 14400 Exp Full Rx Erasure, Voice 14400 Exp Full Rx Full B&B1, Voice 14400 Exp Full Rx Full B&B2, Voice 14400 Exp Full Rx Erasure, Voice 9600 B&B1, Voice 9600 Prob Bit Error, Voice 9600 Erasure, Voice 9600 B&B2, MCAP Full, MCAP Half, MCAP Quarter, MCAP Eighth, MCAP Invalid Full, MCAP Invalid Half, MCAP Invalid Quarter, MCAP Invalid Eighth, MCAP B&B, MCAP D&B, MCAP Full Erasure, MCAP Half Erasure, MCAP Quarter Erasure, MCAP Eighth Erasure, MCAP Full Errors, MCAP Half Errors, MCAP Quarter Errors, MCAP Eighth Error

123

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT)

APPENDIX 7E: COMPAS IS-95 Messaging Acronyms


Abbreviations ACACRM ACDBM ACM ACOM ACORGM ACPRM ACREGM ACRM ACSRM ACTCM AHDM AWIM EHDM FDBM FFWIM FOM FPCPM FSBDM HCM HDM ITSPM LAST_ACC_CH LAST_FWD_TCH LAST_PAGE_CH LAST_REV_TCH MSRM NLUM OCM PCACM PCAPM PCCAM PCCLM PCDBM PCENLM PCESPM PCFNM PCGPM PCGSRM PCNLM Message Types Access Channel Authentication Challenge Response Message Access Channel Data Burst Message Authentication Challenge Message Access Channel Order Message Access Channel Origination Message Access Channel Page Response Message Access Channel Registration Message Authentication Challenge Response Message Access Channel Status Response Message Access Channel TMSI Assignment Completion Message Analog Handoff Direction Message Alert with information Message Extended Handoff Direction Message Forward Data Burst Message Forward Flash With Information Message Forward Order Message Forward Power Control Parameters Message Forward Send Burst DTMF Message Handoff Completion Message Handoff Direction Message In-Traffic System Parameters Message Last Access Channel Information Message Last Forward TCH Message Last Page Channel Message Last Reverse TCH Message Mobile Station Registered Message Neighbor List Update Message Origination Continuation Message Page Channel Authentication Challenge Message Page Channel Access Parameters Message Page Channel Channel List Message Page Channel CDMA Channel List Message Page Channel Data Burst Message Page Channel Extended Neighbor List Message Page Channel Extended System Parameters Message Page Channel Feature Notification Message Page Channel General Page Message Page Channel Global Service Redirection Message Page Channel Neighbor List Messages 124

Optimization Procedures and Guidelines, Ver 1.2 Single Cell Functional Test (SCFT) PCOM PCPM PCSLPM PCSPM PCSRM PCSSDUM PCSTRM PCTAM PMRM PRM PSMM RDBM RFWIM ROM RPM RSBDM SCCM SCONM SEREQM SEROCM SM SOCM SPM SREQM SRESM STATRM STRM SUM SYNC TAC TAM TAC TAM Page Channel Order Message Page Channel Page Message Page Channel Slotted Page Message Page Channel System Parameters Message Page Channel Service Redirection Message Page Channel SSD update Message Page Channel Status Request Message Page Channel TMSI Assignment Message Power Measurement Report Message Parameters Response Message Pilot Strength Measurement Message Reverse Data Burst Message Reverse Flash With Information Message Reverse Order Message Retrieve Parameters Message Reverse Send Burst DTMF Message Service Connect Completion Message Service Connect Message Service Request Message Service Option Control Message Status Message Service Option Control Message Set Parameter Message Service Request Message Service Response Message Status Request Message Status Response Message SSD Update Message Sync Channel Message TMSI Assignment Completion Message TMSI Assignment Message TMSI Assignment Completion Message TMSI Assignment Message

125

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test

8.0 Initial Coverage Test


8.1 Description

The purpose of the initial coverage test is to characterize the coverage and performance of a group (cluster) of cells that provide contiguous coverage. The initial coverage test will identify coverage boundaries using measured (not predicted) data and initialize a call sampling benchmark. The initial coverage test will also continue to use the problem resolution matrix (PRM) to confirm predicted (via simulation) problems and record new problems measured during the initial drive test. The initial coverage test drive data, combined with the call statistics, will provide guidance to focus troubleshooting and optimization activities. The relationship of this initial coverage test activity to the entire optimization process is shown in Figure 8.1-1. Note that it is not the intention during this initial coverage test to perform any network optimization. The intent of the initial coverage test is to get an accurate snapshot of initial system performance as designed. Corrective actions may be required to ensure that the system performance snapshot is an accurate one, void of any system hardware and software problems. Given a stable, accurate snapshot of the network performance, then optimization can start. Optimization is addressed as part of Chapter 9, Detailed Problem Resolution.

126

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test

Optimization Preparation
Equipment Installation and Test Verification (Chapter 3)

Network Design Verification (Chapter 2)


Accurate Terrain, Clutter, Xlos Tuning Data System Design via NetPlan/CSSS

RF Parameters Database Verification (Chapter 4)

Spectrum Clearing (Chapter5)

Data Collection and Analysis Tools Selection, Install, Test (Chapter 6)

Network Optimization Single Cell Functional Test (Chapter 7) Initial Coverage Test (Chapter 8) System Optimization and Detailed Problem Resolution (Chapter 9) Final Coverage Survey and Warranty Testing (Chapter 10)

System Operations (Chapter 11)

Commercial Service: Network Performance Monitoring and Expansion (Chapter 11)

Figure 8.1-1 Relationship of Initial Coverage Test Activity to Entire Optimization Process

127

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test

8.2

Tools Required

Table 8.2-1 contains a list of tools required to conduct the initial coverage test. Additional logistics and support items may be needed. Lead engineers should plan for efficient operations. Item Vehicle Description Quantity Preferably a van with enough room for all data 1 per team; gathering equipment plus DM operators. Should be number of teams equipped for drive testing, including power source and per cluster TBD routing for external antennas if necessary. by lead engineer Which can be used as a DM containing large hard 2 per drive test drive (e.g. 2 GB), compatible with DM and GPS H/W vehicle & S/W, phone interface Capable of collecting IS-95 messages in different 1 per Laptop modes of operation (e.g. Markov, various rate sets) Position locating receiver compatible with DM 1 per DM software and laptop computer. 1 Per DM

Lap Top Computer DM Software GPS

CDMA Phone Phone must have valid ESN and phone # for with extra current system. batteries or power adapter HP Pilot Scanner HP E7472A RF Measurement System or Equivalent for proper frequency Analog or other Used for coordinating activities with test leader and non-test phone contacting MTSO personnel or CFEs as needed, or for emergency purposes. SMAP Used for collecting messaging and Reverse Link FER at the BTS/CBSC. RF Performance COMPAS, OPAS or equivalent that will enable Analysis Post plotting of RF performance characteristics Processing Tool CDL Analysis Tool (CAT) PM Reports

1 per team 1 per vehicle

1 per MM Scale to accommodate number of clusters 1

Used to verify system stability as part of postprocessing and analysis. Network performance statistics that are displayed 1 per CBSC from the CBSC Event Logs Used to verify system stability during initial coverage 1 per CBSC test Color Plotter or Hewlett Packard or Equivalent Scale for number Printer of clusters Table 8.2-1: Tools Required for Initial Coverage Test

128

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test

8.3

Personnel Required
Skills Good computer background, capable of operating DM, and CDMA phone. (See Appendix A) Valid drivers license. Must keep safety and comfort of data collectors in mind at all times. Strong in Unix operations and experienced in Motorola infrastructure equipment. (See Appendix A) White Belt through Blue Belt as appropriate (See Appendix A) Table 8.3-1: Personnel Required

Personnel DM Operator Driver CBSC Engineer System Engineer

8.4

Entrance Criteria

For the cluster that will be tested, collect the data sheet(s) filled out during Single Cell Functional Test (Chapter 7) and examine them to verify that all cells in the cluster and the first tier surrounding sites have passed the SCFT. The cells must all be in service and stable with no service affecting alarms during the time of data collection.

8.5

Procedure

The following activities are required in order to obtain an accurate report of the networks performance as designed: Generate Drive (Metric) Routes Tools Preparation Collect Performance and System Stability Data Evaluate System Stability Data Process and Plot Performance Data, Calculate Call Statistics Data Analysis, Update Problem Resolution Matrix

Each of these is described below. 8.5.1 Generate Drive (Metric) Routes: Guidelines are presented to develop coverage test drive routes: Obtain road maps for the cluster area and identify drive test or metric routes for the cluster. Metric routes should focus on traversing the coverage area. The route should go through all sectors and extend to the edge of the predicted coverage areas (use the simulation outputs to guide this part of the activity) and into coverage areas of surrounding tier sites (far enough to enter into soft handoff with the first tier sites). Special attention should be paid to points or routes of extreme interest such as stadiums, arenas, interstate or major highways, and locations where many 129

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test users may congregate. The customer should be consulted to help prioritize these areas. Drive routes should be designed so they can be completed in a single shift. Include time for breaks, lunches, dinners, etc.

After the engineer specifies preliminary drive routes, the task of final route selection should be given to the drive teams. The final drive routes should account for one way streets, construction, etc. 8.5.2 Tools Preparation The following tools will need to be configured properly for operation in the cluster under test: - DM and/or Pilot Scanner - SMAP - Post-processing tools, including plotter and/or printer - Phone 8.5.2.1 DM/Phone Configuration The DM should be properly configured to collect data in the cluster of interest. This will include setting up the PN-to-BTS/Sector conversion tables (named differently for various DM products) so they are properly displayed on the real-time DM displays. Examples of configuration items to verify would be slot cycle index, antenna configuration (e.g. handheld or external mount w/ attenuators), etc. Precaution: In areas of ICBSC-SHO, Markov mode should not be used if ICBSCSHO is enabled. Markov calls are prohibited from doing anchor handoffs. For this reason, clusters should be defined so they generally do NOT cross CBSC seams. In areas near CBSC seams, use continuous voice calls (perhaps pegged to a radio station re-broadcast) to collect continuous data. (There are separate issues associated with FER reporting when not using Markov calls. These are discussed below.) 8.5.2.2 HP Pilot Scanner Configuration The HP pilot scanner will require configuring per the users manual. Primary considerations are antenna placement for CDMA and GPS, connections between antennas, receiver unit, phone (if used), PC, and dead reckoning sensors if used. In addition, the software should be installed and verified. 8.5.2.3 SMAP BSC Configuration SMAP profiles should be set up to collect data for the ESNs being used. Priority should be given to the Markov or continuous call mobiles. In smaller systems with only a few test mobiles, all test mobiles can be logged. SMAP has a limit of logging 10 ESNs at time if configured properly. Refer to the SMAP documentation to identify and set up the correct profile or see http://scwww.cig.mot.com/~thakkar/smap.html for information on the ISMAP script.

130

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test 8.5.2.4 Post-Processing Tools Configuration Refer to the individual tools users guides to install and test the post-processing and analysis tools. These will include COMPAS/OPAS32/or equivalent, CDL Analysis Tool, PM Reports, and others as listed in Table 8.2-1. Many of these will already have been set up to facilitate the SCFT discussed in Chapter 7. 8.5.3 Collect Performance and System Stability Data Data collection activities are broken down into (real-time) performance data collection and (post-drive) stability and performance data collection. 8.5.3.1 Real-Time Drive Data Collection The initial coverage drive should collect both call sampling and Markov data. Call sampling data can be collected in one of two manners. The first method would be the most efficient, taking advantage of DM features such as drummer mode (CAMPS) to automatically place calls at prescribed intervals for specific durations. Since originations require more IS-95 message exchanges, this method also provides a more complete test of the system than terminations. Using this method, all calls placed along the entire drive route would be mobile-initiated. If this approach is used during the initial coverage drive and subsequent metric drive routes, then at some point in the optimization cycle a special termination-only test should be performed. This can be facilitated by using a special termination test script found on the NSS Best Practices web page. Click on the scripts button at: http://www.rochellepark.pamd.cig.mot.com/cdma/bestpractices.html. Download either the 16 Bit AutoDial! script Version 2.3 for Windows 3.1/95 or 32 Bit AutoDial! script Version 3.1 for Windows 95/NT. See the Readme information for additional details on this tool. The data sheets in Appendix 7B should be modified for the all-origination or all-termination test scenario. The second approach would be to collect call data using the customer call model. The customer should be consulted to determine the ratio of land-to-mobile and mobile-to-land calls and the typical duration of each call. If used the call sampling data sheets shown in Appendix 7B should be modified to reflect the call model. The Markov data collection sheet shown in Appendix 7A should be used to collect Markov data. To minimize data loss, the maximum duration of Markov calls should be 30 minutes. In addition, the following activities should be undertaken during the data collection period: Problem Reporting: An appropriate command and control structure should be set up so that problems in the field can be resolved as quickly as possible. Reports of outages should be dispatched immediately to CBSC operators and/or CFE crews to correct the situation. The lead engineer should act as the switchboard to direct all real-time problem resolution. CBSC Monitoring: MTSO personnel should be notified which cluster(s) are being driven so they know which CBSCs and BTSs to monitor for stability and availability. This requires continuous vigilance for device outage and alarm conditions. Any service affecting outage should be reported to the lead engineer so he may notify drive crews and test leaders as quickly as possible. The SMAP 131

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test AP should be checked periodically to ensure that it remains operational and continues collecting SMAP data. This can be done by periodically checking the SMAP file sizes. Any problems should be logged in a notebook. CFE Crew Availability: CFE crews should be on standby to address any BTSrelated outages that occur during the drive test. DM Monitoring: The drive team should be trained to continuously monitor the DM to make sure that it has not locked up and that the system always appears to be operating properly.

8.5.3.2 Post-Drive Data Collection Set up a logical directory structure that accounts for multiple types and sets of data collected on multiple days. A sample directory structure that can be used to store data on analysis computers is shown in Appendix 8A. Following the drive test activity, the data sets listed below will need to be collected in the appropriate directory in a central location for follow-up processing and evaluation: Drive test data for all test mobiles, including call sampling and Markov (or continuous) call mobiles. Transfer the mobile files to the appropriate directory on the analysis computer. SMAP data should be retrieved if the tool was used during the drive test. This will be used as input to generate RF performance plots. This data is typically transferred across a network to the analysis computer. CDLs for the ESNs that were used, and for the duration of the drive test, should be extracted from the OMC-R. This data is typically transferred across a network. Event logs for the period of the drive test should be collected. This data is typically transferred across a network. Caution: The optimization teams should adopt the policy to move data from the OMC-R and CBSC APs to different processing machines so that no data processing occurs on the operational system. 8.5.4 System Stability Verification

This exercise includes verification that the system was stable during data collection. System stability can be quickly verified for the time during which data was collected by using the Performance Management Reports (PMReports). PMReports are network performance statistics that are displayed from the CBSC. The PMSUM Suite of scripts is derived from the PM records, which are collected in the OMCR and reside on the AP (Application Processor). PMSUM suite of scripts combines the most widely used PM reports into four reports for easy usage. 1. The PMMCC report, will provide information for every channel with traffic. The report contains information on channel usage, OOS time, originations, terminations, access failures, RFloss and channel usage time per attempt. This 132

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test information is provided in the form of a report that can be easily viewed for each cell in the drive area as shown in figure 8.5.4-1 below. 2. The CEM report, which provides information on unplanned BTS availability, total BTS availability, and alarms within a 24hr period. Example reports are shown below in figure 8.5.4-2. 3. The PMSUM report, which includes the Cell RF Loss and Access Report, the Cell Handoff Report, the MM Summary, the Carrier Summary, and the Worst 15 Cells Failing above 1%. These reports give details regarding walsh code hours, D-D, DA, Soft and Softer handoffs, as well as much more. An example of each report is shown in figure 8.5.4-3. 4. The PMTRAF report reports traffic data including traffic minutes and load % as shown in figure 8.5.4-4. After obtaining these reports review them to ensure that there were no outages or alarms and that every cell/sector in the area being driven had channel usage. There may be instances where a cell/sector in the area being driven will not report usage due to the drive route designed for this area. In this case, it is important to review the first two reports mentioned above. For executables, as well as documentation for the PMSUM suite of scripts visit the Rochelle Park Software Archive web page: http//www.rochellpark.pamd.cig.mot.com/software.html in the pmsum section.

133

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test Figure 8.5.4-1: PMMCC Report MCC AND CHANNEL REPORT
Cell 5 5 5 5 5 5 5 5 5 5 5 5 5 5 MCC CE 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 3 4 5 6 7 0 1 2 3 4 5 6 7 Usage min 61.8 84.5 53.5 64.7 67.5 59.6 54.5 81.5 74.3 76.3 58.2 63.5 62.5 53 OOS min 19.2 19.2 19.2 19.2 19.2 19.2 19.5 19.5 19.5 19.5 19.5 19.5 19.5 19.5 Orig Atts 34 25 24 31 27 21 27 21 23 26 20 21 26 29 Orig Comps 29 23 23 28 27 20 26 19 22 23 20 19 23 27 Orig Fail % 14.7 8 4.2 9.7 0 4.8 3.7 9.5 4.3 11.5 0 9.5 11.5 6.9 Term Atts 7 11 17 13 6 13 12 12 13 15 13 18 10 8 Term Comps 7 9 14 11 6 11 10 11 11 15 12 16 9 7 Term Fail % 0 18.2 17.6 15.4 0 15.4 16.7 8.3 15.4 0 7.7 11.1 10 12.5 RF Loss 2 4 7 3 2 3 3 1 5 2 2 6 2 3 RF Loss % 5.6 12.5 18.9 7.7 6.1 9.7 8.3 3.3 15.2 5.3 6.2 17.1 6.2 8.8 Usage/ Att 90.4 140.9 78.2 88.2 122.8 105.1 83.9 148.1 123.8 111.6 105.8 97.6 104.2 85.9

Equation Notes: CE: Channel element Usage min: Number of minutes channel element is in use OOS min: Number of minutes channel element is OOS Orig Atts: Origination Attempts Orig Comps: Origination Completes Orig Fail%: Origination (Attempts - Completes) / Attempts Term Atts: Termination Attempts Term Comps: Termination Completes Term Fail%: Termination (Attempts - Completes) / Attempts RF Loss: Total of 1+2+3 way RF Losses on channel element Usage/Att: Average number of usage seconds per attempt

134

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test Figure 8.5.4-2: CEM Report 8.5.4-2A BTS/MCC Availability
MM 3 3 3 1 1 2 2 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 BTS 26 26 26 152 164 169 169 191 191 193 193 207 207 304 308 308 308 308 310 310 310 310 Sec 1 2 3 3 3 1 2 1 2 1 2 1 2 2 1 2 3 4 1 2 3 4 Tot MCCs 12 12 12 12 12 6 6 4 4 4 4 4 4 12 10 10 10 10 12 12 12 12 Available Minutes 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 1440 OOS Redundant Minutes 0 0 0 0.3 0 0.1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 OOS Manual Minutes 0.5 0.5 0.9 1379.7 0.5 0 0 0 0 0 0 0 0 0.5 0 0 0 0 0 0 0 0 OOS Automatic Minutes 0.2 0 0.2 0 0.2 3.5 3.8 1.3 1.3 744.4 742.2 1 1 0 15.5 15.5 15.5 15.5 8.3 8.3 8.3 8.3 Unplanned OOS_AUTO Avail% 99.99% 100.00% 99.99% 100.00% 99.99% 99.76% 99.73% 99.91% 99.91% 48.31% 48.46% 99.93% 99.93% 100.00% 98.92% 98.92% 98.92% 98.92% 99.42% 99.42% 99.42% 99.42% Total Avail% 99.95% 99.97% 99.92% 4.19% 99.95% 99.76% 99.73% 99.91% 99.91% 48.31% 48.46% 99.93% 99.93% 99.97% 98.92% 98.92% 98.92% 98.92% 99.42% 99.42% 99.42% 99.42%

135

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test 8.5.4-2B Device Outage and Alarm Listing Alarms Maj Min

MM Device

Initial Status

Final Status

1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1

CBSC-1 FEP-1-1-8 FEP-1-1-13 MSIP-1-1-6 MSIP-1-1-8 MSIP-1-1-17 MSIP-1-1-26 MSIP-1-1-27 MSIP-1-1-52 MGLI-5-2 GLI-5-3 GLI-5-4 BBX-5-21 BBXR-5-21 BDC-5-3 MCC-5-21 MCC-5-22 CSM-106-1 BBX-152-1 BBX-152-3 BBXR-152-1 BBX-152-21 BBX-152-23 BBXR-15221 CSM-153-1 BBX-153-1 BBX-153-2 BBX-153-21 BBX-153-22

INS INS INS PRECUT PRECUT PRECUT PRECUT PRECUT PRECUT OOS_MANUAL OOS OOS OOS_MANUAL OOS_MANUAL OOS_MANUAL OOS_MANUAL OOS_MANUAL INS INS OOS_MANUAL OOS_MANUAL INS OOS_MANUAL OOS_MANUAL INS INS INS INS INS

INS INS INS PRECUT PRECUT PRECUT PRECUT PRECUT PRECUT OOS_MANUAL OOS OOS OOS_MANUAL OOS_MANUAL OOS_MANUAL OOS_MANUAL OOS_MANUAL INS INS OOS_MANUAL OOS_MANUAL INS OOS_MANUAL OOS_MANUAL INS INS INS INS INS

OOS Manual (min) 0 0 0 0 0 0 0 0 0 1440 0 0 1440 1440 1440 1440 1440 0 0 1380 1380.2 0 1379.3 1379.6 0 0 0 0 0

OOS Auto (min) 0 0 0 0 0 0 0 0 0 0 1440 1440 0 0 0 0 0 0 0 0 0 0 0 0.1 0 0 0 0 0

OOS Cri Parent (min) 0 0 0 1 0 1 0 0 0 23 0 18 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0

0 0 0 2 9 20 0 0 14 0 0 0 0 0 0 0 0 0 8 0 0 19 0 0 0 3 2 6 1

3 0 0 1 8 2 0 1 7 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0

136

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test 8.5.4-2C Alarm Summary Total First
2 3 2 7 5 6 5 4 4 4 2 2 1 2 1 7 7 1 3 257 182 429 437 139 0:22 11:22 12:33 0:37 11:22 11:29 14:31 14:31 11:22 11:22 12:05 1:12 15:23 0:46 15:24 2:08 2:08 12:15 12:15 7:27 8:23 8:06 6:38 0:27

Last
10:33 16:49 14:31 15:24 14:31 14:31 14:31 15:24 14:31 15:24 15:27 2:01 15:23 10:10 15:24 23:58 23:58 12:15 12:15 18:16 17:58 19:24 21:51 20:33

Sev Alarm # Card


** 1-1038 1-1351 1-1352 1-1353 1-1354 1-1355 1-1356 1-1357 1-1372 1-1373 1-1374 ** ** *** *** * * * * ** ** ** ** ** Jan-50 Jan-10 Jan-11 Jan-12 Jan-02 Jan-12 Jan-20 Jan-20 Jan-41 Jan-42 Jan-43 Jan-51 Jan-52 MGLI MGLI MCC MCC MCC MCC MCC MCC BBX BBX BBX LCI FEP FEP FEP BBX BBX BBXR BBX BBX BBX BBX BBX BBX

Description
MF FAN #8 - Low Speed Alarm BTS LAPD GLI-MM Datalink - Failure: Recovered BTS LAPD GLI Datalink #1 - Failure: Recovered BTS LAPD GLI Datalink #2 - Failure: Recovered BTS LAPD GLI Datalink #3 - Failure: Recovered BTS LAPD GLI Datalink #4 - Failure: Recovered BTS LAPD GLI Datalink #5 - Failure: Recovered BTS LAPD GLI Datalink #6 - Failure: Recovered BTS LAPD GLI Datalink #21 - Failure: Recovered BTS LAPD GLI Datalink #22 - Failure: Recovered BTS LAPD GLI Datalink #23 - Failure: Recovered LPA Unit: Pilot Tone Suppression Alarm Detected FEP Detected Unroutable SCAP Message - Link not INS FEP Detected Unroutable SCAP Message - Path not Open FEP Detected Unroutable SCAP Message - Invalid Path BBX - CHI Link B: Frame Sync Error BBX - CHI Link B: Frame Slip Error BBX Detected Forward Link Parity Burst Error BBX Detected Forward Link Parity Burst Error Forward Power Very High Alarm: Sector 1 Forward Power Very High Alarm: Sector 2 Forward Power Very High Alarm: Sector 3 Reverse Noise Rise Very High Alarm: Sector 1 Reverse Noise Rise Very High Alarm: Sector 2

Once the system and has been evaluated to determine that the data is worthy of being processed and has some value, then the data should be processed. Not all service affecting outages would necessarily dictate that the data does not have some value. Engineers will have to make judgement calls to determine if the data has merit. 8.5.5 Process and Plot Performance Data, Calculate Call Statistics

8.5.5.1 Process and Plot Performance Data Using the post processing tool(s) selected and installed (Chapter 6), generate the following, minimum set of RF performance plots: Forward Link Performance Plots: - Mobile Receive - Ec/Io - Frame Erasure Rate (FFER) Reverse Link Performance Plots: 137

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test Mobile Transmit Frame Erasure Rate (RFER) (this will only be possible if SMAP data is available)

It will also be useful to plot and evaluate (if possible) the Active vs. Other Ec/Io Delta. This ratio gives an indication of where the active set pilots are relatively stronger (or weaker) than the rest of the pilots in an area, and can be useful in repairing possible neighbor list and pilot pollution issues. 8.5.5.2 Calculate Call Sampling Statistics For the call sampling ESNs used along the metric route, calculate call completion and call drop statistics from the perspective of both the drive team and the CDLs. Any operator errors should be culled from the statistics. Call Completion Rate For drive test data, the formula to use for calculation of call completion rate (CCR) is: CCR = Number of Completed Calls Number of Attempted Calls

All operator errors should be excluded from this calculation. For CDLs, the formula to use for calculation of call completion rate is: CCR = S / X Where: S = total number of calls completed as defined as the sum of all CDLs whose LAST_MM_SETUP_EVENTS is less than or equal to 17 and LAST_MM_SETUP_EVENTS is greater than or equal to 23. X= total number of calls (i.e. CFCs 1+3+4+5+9+13+) There may be a slight difference between these sets of statistics because some origination attempts may have never been seen or registered on the system. In addition, CDL call completion relies on the mobile reaching operation on the traffic channel. In some instances, calls will get to traffic channel, but there will be no audio. Typically, the operators do not regard these as completed calls. Note: The CDL Analysis Tool (CAT) generates call completion rates based upon specific releases. Refer to CAT documentation for specifics.

138

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test Call Drop Rate The formula to use for calculation of call drop rate (CDR) using drive test data is: CDR = Number of Calls Ended Normally by Either DM or Operator After Full Call Duration Number of Completed Calls

The formula for calculation of call drop rate using CDLs is: CDR = All RF Losses (CFC 4s) All Ss except CFC 1s and 25s

Note: The CDL Analysis Tool generates statistics for call drop rate that are specific to a given release. These baseline statistics should be recorded or logged to start developing a performance history profile for this cluster. 8.5.6 Data Analysis, Update Problem Resolution Matrix Following the guidelines outlined in Section 8.6, perform an initial assessment of the RF performance of the cluster under investigation. This exercise will be used to confirm if the system design that was prepared for the area is accurate or not. In addition, assuming the system was stable during the data collection, then this exercise will also identify any other problems that were not originally indicated with the network planning tools. Once the area has been analyzed, then the Problem Resolution Matrix (PRM) should be updated to capture all the problems. This PRM may be reviewed with customers to help prioritize resolution of the issues.

8.6

Analysis Conducted

Analysis for this initial coverage test data is limited to review of the RF performance plots to identify areas that do not meet specific RF performance criteria. Each of the RF performance indication plots should be reviewed with the simple guidelines that follow. Particular focus should be given to revisit problems already existing in the PRM. All problems should be captured in an update to the PRM. 8.6.1 Forward Link Performance Evaluation The forward link performance indicators evaluated should include Mobile Receive, Ec/Io, FFER, and Active vs. Other Ec/Io Delta. If the basic criteria listed below are not met, the area on the map should be highlighted, defined as a problem, and entered into the PRM.

139

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test 8.6.1.1 Evaluation of Mobile Receive Strength Receive power is a composite measurement indicating the signal strength in the CDMA band of interest at the antenna. This level is primarily affected by the number of pilots serving a particular area and the distance between the mobile and the BTS antennas. Terrain, foliage, buildings or highway structures between the mobile and BTS antennas will reduce the measured signal strength at the mobile. Table 8.6.1.1 shows suggested cutoff levels for acceptable signal strength based upon the number of pilots contributing to that composite measurement. Please keep in mind that these are suggestions derived from field work, where fading, vehicle velocity, antenna placement, and other factors affect the real world measurements. (I.e. Results that would be obtained in a wired lab or controlled experiment would be much better.) Acceptable Mobile Receive Level (dBm) 1 -87 2 -84 3 -82 >3 -80 Table 8.6.1.1: Relationship Between Number of Pilots Serving an Area and Acceptable Mobile Receive Signal Strength Naturally, signal strengths measured closer to BTSs should be higher. If they are not, this would be an indication of a potential problem. 8.6.1.2 Evaluation of Ec/Io Ec/Io is a measurement of usable energy in each chip (for a specific PN) as compared to the total noise at a particular location. This value is used to trigger handoffs at the mobile. Any areas where all pilots are consistently below Tadd are a problem. These locations should be noted in the PRM. For these cases, signal strengths of the best potential server(s) will be primary candidates for increasing their signal presence in the area, via either antenna tilt or SIF power changes, in order to provide adequate signal quality. This problem may or may not be combined with poor Mobile Receive power levels. In addition, suboptimal neighbor lists can create poor Ec/Io, regardless of receive levels. Areas that have multiple pilots present, where all pilots fluctuate between Tadd+3dB and Tdrop-3dB are areas referred to as non-dominant pilot areas. In this case, some of the serving sectors will have to be identified as candidates for power increases and/or decreases. The key is to create an appropriate number of adequate pilots. If N-way is available, up to six pilots are useable. Without N-way, there should be no more than three dominant pilots in an area. Tradeoffs need to be made in surrounding areas. In any case, this situation deserves to be an entry in the PRM. 8.6.1.3 Evaluation of Forward Link Frame Erasure Rate (FFER) Forward FER typically is associated with areas of poor Ec/Io. However, there may still be areas of acceptable Ec/Io that have somewhat elevated FER. All locations where FER 140 Number of Serving Pilots

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test rises above 2% should be a PRM entry. It is likely that this area can be improved unless terrain, manmade obstacles, or interferers simply prevent adequate signal-to-noise ratio at the mobile location. 8.6.1.4 Evaluation of Active vs. Other Ec/Io Delta This performance parameter indicates where interference from useable pilots is simply an issue. If Active vs. Other Ec/Io < 0 in any particular area, then this location should be the target of an investigation and entered into the PRM. If the Active vs. Other Ec/Io > 0, the pilots contributing to the active set are stronger than the ones not in the active set. If the Active vs. Other Ec/Io is < 0, this means the composite energy is greater than the energy of the pilots in the active set, and therefore, create interference. 8.6.1 Reverse Link Performance Evaluation Primary reverse link performance indicators include Mobile Transmit Power and Reverse Frame Erasure Rate (RFER). 8.6.1.1 Evaluation of Mobile Transmit Power The maximum mobile transmit level allowed per IS-95 is 23 dBm. Most system designs try to maintain 6 dB of headroom under that limit. Assuming that the IS-95 link balance equation is adhered to, and using the lower limit of 87 dBm for an acceptable forward link receive level, any areas where Mobile Transmit levels exceeds + 14 dBm should be a PRM entry. [Mob Tx = - (Mob Rx) + 73 + NomPwr + InitPwr] 8.6.1.2 Evaluation of Reverse Link Frame Erasure Rate (RFER) Similar to the forward link, if FER on the reverse link rises above 2% in an area, this area deserves to be an entry in the PRM.

141

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test

8.7

Exit Criteria

The following exit criteria should be met prior to taking steps to change the system (Chapter 9). 1. Data should be collected and verified as accurate and representative of initial system design. Some locations in the cluster may require data re-collection if there were any service affecting problems in those areas while data was collected. 2. Call sampling statistics have been calculated and computed using drive team data and CDLs respectively. 3. Cluster wide plots have been generated for the following RF performance attributes: - Forward Link - Mobile Rx - Ec/Io - Forward FER - Active vs. Other Ec/Io Delta (desirable, but not necessary) - Reverse Link - Mobile Tx - Reverse FER (if SMAP data is available) 4. The problem resolution matrix has been updated based upon the preliminary analysis outlined in Section 8.6.

142

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test

Appendix 8.A: Sample Directory Structure

1. For Mobile logs, CDLs, Event logs, SMAP logs and SCUNO (or other information) from the field: /Hard Drive Partitions/Project Name/Subproject Identifier/Field_Data/Date/CBSC/Type of data/ - Hard Drive Partition (e.g. local_data) - Project Name (e.g. JCDMA) - Subproject Identifier (e.g. KCT) - Field_Data (use Field_Data as the directory name) - Date (YYMonDD e.g. 98Apr02) - CBSC (e.g. B1) - Type of Data (e.g. mobile, CDLs, Events, SMAP, etc.) Under Mobile directory: /CS (for call sampling logs) /ESN (e.g. 088d0c85) /Type of failure (e.g. FTO, DROP, etc.) /CS_out (for COMPAS reports) /Mkv (for Markov logs) /ESN (e.g. 04030c85) / Type of failure (e.g. FTO, DROP, etc.) /Mkv_out (for COMPAS reports)

2. For system configuration files (Antenna tilts, neighbor lists, etc.): /Hard Drive Partitions/Project Name/Subproject Identifier/System_Config/ - Antenna tilt list (naming convention: MMDD<System>_<name>; e.g. 0406K_antenna_tilt_list) - Neighbor lists (naming convention: MMDD<System>_<name>; e.g. 0415Q_master_list) - MIB (e.g. SIF pilot power list, CLI commands, etc.) 3. For Plots: /Hard Drive Partitions/Project Name/Subproject Identifier/Plots/Date/ - Plots (naming convention: <Area>_<variable>.plot; e.g. B1_EcIo.plot) - Variables: EcIo, Tx, Rx, Fails

143

Optimization Procedures and Guidelines, Ver 1.2 Initial Coverage Test 4. For Tools/Scripts: /Tools/ - pgrep - banditview - etc. 5. NetPlan Projects and Information (color files, etc.): a) Analysis path and naming convention: NetPlan/Projects/Subproject ID/ - MMDD<cluster><call type><time> - Cluster = B1, B3, etc. - Call Type = C (Call sampling), M (Markov), S (Special test) - Time = N (Night); D (Day) - (e.g. 0403KB3CN) b) Color Files: NetPlan/color_files

144

Optimization Procedures and Guidelines, Ver 1.2 RF Network Optimization

9.0 RF Network Optimization


9.1 Description

The purpose of RF network optimization is to ensure each cluster is integrated into the overall network and prepare the CDMA system for commercial service. When this activity is completed, the network should pass the final coverage test and contractual warranty performance criteria. These are the performance targets that should be achieved by the network optimization team. This chapter provides guidance on characterizing and resolving a variety of common problems, with the emphasis on resolution of RF performance problems. Problems can generally be classified into four main categories as follows: System Design RF coverage, multiple pilot problems, parameter settings Infrastructure database errors, hardware and software problems Subscriber Unit phone problems Equipment or Processing problems with data collection or post-processing tools References for basic optimization activities such as neighbor list prioritization and cell radius checking are provided. Detailed analysis for RF coverage problems will include both the forward and the reverse links. Multiple pilot problem analysis will cover solutions for both too many pilots and lack of a dominant pilot. Infrastructure and subscriber unit problems will be discussed in less detail, as these are typically related to a specific release or product issue. Additional reference material is contained in Parameters and Optimization found at http://www.cig.mot.com/~dillon/. It is not the intent of this chapter to recreate that information. Optimization engineers should read that document and see http://scwww.cig.mot.com/people/cdma/PjM/product/release_info/ for specific information for the release being optimized. The relationship of this activity to the overall network optimization process is shown in Figure 9.1-1.


[Figure 9.1-1: Relationship of System Optimization and Detailed Problem Resolution. Flowchart: Optimization Preparation (Network Design Verification, Chapter 2; Equipment Installation and Test Verification, Chapter 3; RF Parameters Database Verification, Chapter 4; Spectrum Clearing, Chapter 5; Data Collection and Analysis Tools Selection, Install, Test, Chapter 6), supported by accurate terrain, clutter, and Xlos tuning data and by system design via NetPlan/CSSS, feeds Network Optimization (Single Cell Functional Test, Chapter 7; Initial Coverage Test, Chapter 8; RF Network Optimization, Chapter 9; Final Coverage Survey and Warranty Testing, Chapter 10), followed by System Operations (Chapter 11) and Commercial Service: Network Performance Monitoring and Expansion (Chapter 11).]

9.2 Tools Required

Table 9.2-1 lists the tools required to conduct RF network optimization. Additional logistics and support items may be needed. Lead engineers should plan efficient data collection and processing operations. Specific considerations include: the number of drive teams and vans; arranging night driving in crowded, congested areas and avoiding rush hour on the basic metric routes whenever possible; scheduling CBSC operators and CFE personnel to support drive testing; and the availability of data collection and post-processing tools. Other market-specific concerns should be addressed as required.


Vehicle: Preferably a van with enough room for all data gathering equipment plus DM operators; should be equipped for drive testing, including a power source and routing for external antennas if necessary. Quantity: 1 per team; number of teams per cluster TBD by lead engineer.

Lap Top Computer: Can be used as a DM; contains a large hard drive (e.g. 2 GB); compatible with the DM and GPS hardware and software; phone interface. Quantity: 2 per drive test vehicle.

DM Software: Capable of collecting IS-95 messages in different modes of operation (e.g. Markov, various rate sets). Quantity: 1 per laptop.

GPS: Position locating receiver compatible with the DM software and laptop computer. Quantity: 1 per DM.

CDMA Phone (with extra batteries or power adapter): Must have a valid ESN and phone number for the current system. Quantity: 1 per DM.

Analog or other non-test phone: Used for coordinating activities with the test leader and contacting MTSO personnel or CFEs as needed, or for emergency purposes. Quantity: 1 per vehicle.

SMAP: Used for collecting messaging and reverse link FER at the BTS/CBSC. Quantity: 1 per MM.

RF Performance Analysis Post-Processing Tool: COMPAS, OPAS, or equivalent that enables plotting of RF performance characteristics. Quantity: depends on how many clusters are being optimized.

CDL Analysis Tools: Used to verify system stability as part of post-processing and analysis. Quantity: 1.

PM Reports: Network performance statistics displayed from the CBSC. Quantity: 1 per CBSC.

Event Logs: Used to verify system stability during system optimization. Quantity: 1 per CBSC.

Table 9.2-1: Tools Required for RF Network Optimization


9.3 Personnel Required

Key personnel required to conduct RF network optimization are identified in Table 9.3-1. Additional persons and skill sets may be required to address logistics and planning issues.

DM Operator: Good computer background; capable of operating the DM and CDMA phone. (See Appendix A)

Driver/Navigator/Map Maker: Valid driver's license. Must keep the safety and comfort of the data collectors in mind at all times.

CBSC Engineer: Strong Unix operations background and experience with Motorola infrastructure equipment. (See Appendix A)

System Engineers: White Belt through Blue Belt as appropriate. (See Appendix A)

Table 9.3-1: Personnel Required

9.4 Entrance Criteria

1. An updated Problem Resolution Matrix contains a prioritized list of all the problem areas identified in the initial (or most recent) coverage drive and the latest simulation activities.

2. Plots of the following RF performance measures are available from the previous drive test:
   Forward link: Mobile Receive; Forward FER; Aggregate Active Ec/Io; Active vs. Other Ec/Io Delta (to give an indication of pilot pollution)
   Reverse link: Reverse FER; Mobile Transmit

3. A current list of antenna azimuth and downtilt angles and SIF powers is available. These will be used to help generate recommendations for changes.

4. Contract warranty performance criteria are set and understood by the optimization team.

9.5 Procedure

The high-level, iterative procedure used to determine whether the optimization activity for the initial network configuration has been completed is shown in Figure 9.5-1. Initially, the input data to this activity will come from the Initial Coverage Test described in Chapter 8. The data will be used to support the basic optimization processes, including integration of each cluster into the overall network, neighbor list prioritization, cell radius checks, RF coverage analysis, pilot pollution analysis, and basic database and infrastructure problem corrections. Statistics and drive test data will be evaluated to determine whether the network performance meets the contractual warranty criteria. If so, the optimization team should enter the final coverage drive and warranty test activity (Chapter 10). If the data does not pass the exit criteria, then the necessary changes must be made to enhance system performance. Change recommendations generated by the various engineers should be reviewed by the lead engineer to ensure consistency across the entire area being optimized. Each change made in the system should be recorded and archived so there is a history of each change and why it was made. These changes should be entered on a change request form or "change order" along with the PRM. The change request forms/orders are usually signed by a lead engineer and then given to the person(s) who will implement the change. A copy of the change order should always be kept by the optimization team in a central notebook or binder for easy reference. Three examples of these forms can be found in Appendix 9A. The results of these changes must be characterized by another iteration of data collection and analysis. This repetitive process continues until the exit criteria are passed.
[Figure 9.5-1: Overall Optimization Flow. Collect data (DM, scanner, SMAP, CDLs, event logs; the initial set of drive test data is generated during the Initial Coverage Test, Chapter 8) -> process data, generate plots, update the PRM -> data analysis (neighbor list updates, cell radius checks, RF coverage repair, pilot pollution clean-up, infrastructure troubleshooting, phone problems, data collection problems) -> does the network meet the contract warranty performance criteria? If yes, done: go to Chapters 10 and 11. If no, evaluate all recommendations, implement the best recommendation, and collect data again.]


9.6 Analysis

The basic non-RF network optimization activity is to ensure that each cluster is integrated into the overall network, while the RF network optimization activities include neighbor list optimization, cell radius checks, RF coverage assessment, elimination of pilot pollution, and correction of database and equipment stability problems. These topics are discussed below.

9.6.1 Neighbor List and Cell Radius Checks

Two of the most basic checks that should be done for each set of drive data collected are neighbor list prioritization and cell radius checks. Neighbor lists are important to enable the mobile to enter into soft handoff with any PN offsets displaying adequate Ec/Io that may be present in an area. Cell radius checks are required to ensure that the mobile can use PN offsets with adequate signal quality to access the system.

9.6.1.1 Neighbor List (NL)

Inaccurate neighbor lists (NL) can have adverse effects on mobile operation. High FER can be caused by strong, interfering pilots that are never scanned or added to the active set because they are not found in the neighbor list update messages sent to the mobile. The presence of various pilots in the same geographic area mandates that those pilots be in each other's neighbor lists. Early in optimization, three primary data sources can be used to update neighbor lists: the DM, the pilot scanner, and call proc logs. The DM and scanner collect information on the pilots present in a specific area; since markets may not have access to both a DM and a pilot scanner, both methods are presented below. Call proc logs should not be used as a source of optimization data and are therefore not discussed. Later in optimization (e.g. the friendly user or commercial stages), neighbor list optimization should be done using CDLs. A relatively new tool called nlp.pl5 can be used to optimize neighbor lists using CDLs as the input; reference information and the tool can be downloaded from http://www.cig.mot.com/~reimrsrr/. Since this chapter primarily addresses the initial optimization activities, nlp.pl5 is not discussed here. Chapter 11, which discusses network monitoring, provides additional guidance on the usage of nlp.pl5.

9.6.1.1.1 Neighbor List Optimization Using DM Data

Use of Dropped Call Data to Identify Neighbor List Deficiencies

After a dropped call, look at the sync channel message to determine the PN offset the mobile used to re-sync itself. If this PN offset was not in the active set when the call dropped, nor in the last neighbor list update message (NLUM), then the PN offset should either be added to the neighbor lists of the active set or be moved up to a higher position in the neighbor list. This will require looking at the messaging and/or using the tool called SOS; see http://www.cig.mot.com/~reimrsrr/SOS.html for more information on SOS. Selection of which neighbor list(s) the missing PN should be added to will be based on the relative distances and angles of the PNs serving the area where the call was dropped.
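Parts of this check can be automated once the DM log has been parsed. The sketch below is illustrative only: the event record layout is an assumption (a real parser, or SOS output, would supply the actual format). It flags drops where the re-sync PN was in neither the active set nor the last NLUM:

def find_nl_deficiencies(events):
    # events: time-ordered dicts with 'type' in
    # {'NLUM', 'ACTIVE_SET', 'DROP', 'SYNC'}, a 'time', and a 'pns' list.
    # Returns (drop_time, resync_pn) pairs suggesting a missing neighbor.
    last_nlum, active_set = set(), set()
    pending_drop = None
    suspects = []
    for ev in events:
        if ev["type"] == "NLUM":
            last_nlum = set(ev["pns"])
        elif ev["type"] == "ACTIVE_SET":
            active_set = set(ev["pns"])
        elif ev["type"] == "DROP":
            pending_drop = (ev["time"], set(active_set), set(last_nlum))
        elif ev["type"] == "SYNC" and pending_drop:
            t, act, nlum = pending_drop
            resync_pn = ev["pns"][0]  # PN the mobile re-synced on
            if resync_pn not in act and resync_pn not in nlum:
                suspects.append((t, resync_pn))
            pending_drop = None
    return suspects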


Use of Extended Handoff Direction Messages to Re-Prioritize Neighbor Lists

The frequency of occurrence of Extended Handoff Direction Messages (EHDM) or Handoff Direction Messages (HDM), which are logged in the DM log files, can be used to rank the neighbors in the neighbor list and can also indicate missing neighbors. This is done by calculating the amount of time, and the number of times, a source sector spends in handoff with each target sector; from this information a prioritized neighbor list can be generated. This type of neighbor list updating can be done with tools like shotime, but one must be careful to make sure that the data sample is valid for making the recommended changes. The following is a list of questions one should ask before using drive test data to make neighbor list updates:

- Is the data sample size statistically significant?
- Do the drive-test routes provide a balanced representation of the sector's coverage area?
- Were there any malfunctioning or out-of-service sectors in the area when the driving was performed?
- Are there hills, buildings, or other obstructions which may explain the difference between a manually generated or simulated neighbor list and the changes indicated by the data sample?

The shotime tool and reference documentation can be obtained from http://www.cig.nml.mot.com/cdma/kctop/tools/shotime.html and http://tcsc.cig.nml.mot.com/~reisman/tools/sho_time.doc.txt. A sketch of this handoff-time ranking appears below.
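The following is a rough, illustrative version of the handoff-time ranking; it is not shotime itself, and the input format is an assumption:

from collections import Counter

def rank_neighbors(handoffs, source):
    # handoffs: (source_pn, target_pn, seconds_in_handoff) tuples derived
    # from EHDM/HDM messaging.  Returns the target PNs for one source
    # sector, ranked by total time spent in handoff with it.
    time_with = Counter()
    for src, tgt, secs in handoffs:
        if src == source and tgt != source:
            time_with[tgt] += secs
    return [pn for pn, _ in time_with.most_common()]

# Example: PN 156 spends the most time with PN 152, then PN 264
print(rank_neighbors([(156, 152, 40), (156, 264, 25), (156, 152, 10)], 156))
# -> [152, 264]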

9.6.1.1.2 Neighbor List Optimization Using Pilot Scanner and MIL Pilot Analyzer

Pilot Analyzer Recommendations

The Pilot Analyzer developed by MIL provides a variety of output recommendations, among them neighbor lists. The "Pilot Analyzer: Design Overview and User's Guide" can be found at http://engdb.tlv.cig.mot.com/tools/PilotAnalyzer.html. The following excerpt from the MIL documentation describes how the Pilot Analyzer generates neighbor list recommendations:

"Neighbor list recommendations are generated for all sectors and are based upon a bin-by-bin evaluation of HP scanner drive test data. For each bin, the analyzer will identify the best server seen in that bin. The analyzer will then add all subsequent sectors above Tdrop to the neighbor list of the best serving sector for that bin. The neighbor list order is determined by keeping track of the number of bins in which a neighboring sector is seen. The user can set a Neighbor Threshold, in percent, in the configure window; neighbors seen in fewer bins than this threshold will be removed from the list. The default recommendation is 1%, or set it to 0% and filter the list manually. When generating a neighbor list, the user will be asked to set the maximum delta between a neighbor sector and the best server sector. After the neighbor list has been generated it can be saved and loaded later on. Loading a neighbor list is done via the File menu, and the neighbors can be viewed graphically using the View menu."
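For quick sanity checks, the bin-by-bin logic described in the excerpt can be reproduced approximately as follows. This is a sketch of our own, not the MIL implementation, and the Tdrop and threshold defaults are illustrative:

from collections import defaultdict

def neighbor_lists(bins, t_drop=-15.0, threshold_pct=1.0):
    # bins: iterable of {pn: ec_io_dB} dicts, one per geographic bin.
    # In each bin, every other PN above t_drop is credited as a neighbor
    # of that bin's best server; neighbors seen in fewer than
    # threshold_pct of the best server's bins are filtered out.
    seen = defaultdict(lambda: defaultdict(int))  # server -> neighbor -> bins
    bins_per_server = defaultdict(int)
    for b in bins:
        if not b:
            continue
        best = max(b, key=b.get)
        bins_per_server[best] += 1
        for pn, ecio in b.items():
            if pn != best and ecio >= t_drop:
                seen[best][pn] += 1
    result = {}
    for server, counts in seen.items():
        cutoff = bins_per_server[server] * threshold_pct / 100.0
        ranked = sorted(counts.items(), key=lambda kv: -kv[1])
        result[server] = [pn for pn, n in ranked if n >= cutoff]
    return result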

9.6.1.2 Cell Radius Checks

The determination of proper cell radius settings is documented in Parameters and Optimization by Dillon/Anderson. See that document for details.

9.6.2 RF Coverage Problems

This section provides general guidelines for classifying and correcting RF coverage problems. The emphasis is on improving or correcting coverage problems through changes to SIF pilot power, antenna azimuth, or antenna tilt. Additional checks are included to separate out infrastructure, mobile, or data collection problems. Appropriate references to additional information are provided.

The improvement of poor RF coverage areas can be classified into three basic cases. The first case is characterized by low mobile receive levels next to the cell site. The second case attempts to improve coverage at the outer bounds of the network's predicted coverage area. The third case involves correcting a localized coverage hole that is within the predicted coverage area but not near a cell site. Once a problem area has been properly characterized, corrective actions are recommended to resolve the poor coverage condition.

The entrance criteria for investigating and correcting poor coverage problems are that field data has been collected and processed and that the following images or plots are available:

1. Mobile receive (Rx)
2. Mobile transmit (Tx)
3. Mobile Ec/Io
4. Mobile forward FER
5. Ec/Io per PN of interest

In addition, RF coverage prediction images and database information such as SIF pilot power settings, antenna configuration information (azimuths, tilts, beam patterns), and neighbor list information should be available.

9.6.2.1 CASE 1: Poor RF Coverage Next to the Cell Site

To start, review the mobile receive plot to identify any low mobile Rx values near the cell sites. Low mobile Rx should be considered as values less than -85 dBm, or values significantly below the mean values near sites for the specific system under evaluation. A poor RF coverage pattern next to the cell site can be caused by obstacles such as buildings, trees/foliage, or terrain, or by appreciable nulls in the antenna's vertical beam pattern that impact the signal strength. These conditions tend to cause acute irregularities in the homogeneity of mobile receive levels as the mobile's distance from the site increases. In some cases, the antennas may be mounted directly on top of buildings, but not close enough to the edges, thereby creating shadowing directly underneath the antennas. Antenna tilt angles should be verified, and the engineer should be aware of the antenna heights to determine whether the receive value is reasonable for the equipment installation and configuration. The optimization engineer should work to identify whether any obstacles could be the cause of the low mobile receive levels and either work around or remove those obstacles if possible.

If obstacles are not determined to be the cause of the low mobile receive, one should next investigate possible infrastructure problems to determine whether the site was off the air or not fully functional during the data collection period. This type of situation shows up in the mobile receive plot as lower than expected values around the entire site or in a specific sector. Some mobile receive energy will of course be indicated from distant sites, but the key is to look at adjacent sites, compare the mobile receive levels on the suspect site to the others, and determine whether they are relatively consistent at similar distances from the cell sites. Event logs, alarms, and CDLs should be checked to see whether any service-affecting alarms were registered on the site, and whether the site was taking calls from the test ESN(s) while the data was collected in that area. Finally, the engineer should investigate database errors related to mobile receive levels, such as SIF pilot power settings. Additional information on how to troubleshoot these infrastructure issues is found in Section 9.6.4. The last item to check is whether the data collection equipment (including the phone), the methods or processes deployed in the data collection, and the post-processing tools worked correctly. Some insight into these problems can be derived from Section 9.6.6.

9.6.2.2 CASE 2: Evaluating Coverage at the Limits of the Predicted Coverage Area

The best indicator for determining whether coverage can be improved at the limits of the network is the mobile transmit power (mobile Tx). Typically the reverse link (mobile to base station) is the limiting link due to the power restrictions of the mobile. Refer to the predicted "mobile transmit power required" image and compare it to the measured data. If the measured and predicted data are within 6 dB of each other, and the mobile Tx level is high (over +17 dBm), then the network can be considered to be nearing its coverage limit. To confirm whether the issue is truly a coverage problem, verify that the mobile receive and transmit levels (path balance) comply with the IS-95 specification (Section 6.1.2.3) at the start of a call:

Mobile Tx (dBm) = -Mobile Rx (dBm) - 73 + NOM_PWR (dB) + INIT_PWR (dB)

(NOM_PWR and INIT_PWR can be found in the database and are described in Dillon's Parameters and Optimization: http://www.cig.mot.com/cdma_ase/index.html.)

If the measured data does not adhere to this equation within 6 dB, then there may be excessive interference on either the forward or the reverse link, or there may be equipment problems that require investigation. If the data conforms to the guideline, then increasing the forward link SIF pilot power on sectors serving the coverage-challenged areas will not improve performance, because the reverse link will not improve anyway. An antenna angle change (azimuth or tilt) might provide some incremental improvement, and this could be investigated. If the Tx power has some headroom available, then changing the SIF pilot power and/or antenna pointing angles may improve the problem area. Caution should be taken to ensure that other adjacent areas are not adversely affected and that priority is given to the areas competing for coverage as desired. Also, increasing SIF pilot powers and modifying antenna pointing angles to extend coverage areas can lead to overshoot problems that should be avoided (see Section 9.6.3, Pilot Pollution). A small path-balance check is sketched below.
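A minimal sketch of that check; the function names are ours, and NOM_PWR and INIT_PWR must come from the database:

def expected_tx_dbm(rx_dbm, nom_pwr_db=0.0, init_pwr_db=0.0):
    # IS-95 open-loop estimate: Tx = -Rx - 73 + NOM_PWR + INIT_PWR
    return -rx_dbm - 73.0 + nom_pwr_db + init_pwr_db

def path_balance_ok(rx_dbm, tx_dbm, nom_pwr_db=0.0, init_pwr_db=0.0,
                    tolerance_db=6.0):
    # Flag samples that violate the 6 dB guideline
    return abs(tx_dbm - expected_tx_dbm(rx_dbm, nom_pwr_db, init_pwr_db)) <= tolerance_db

# Example: Rx = -95 dBm at call start should put Tx near +22 dBm
print(expected_tx_dbm(-95.0))        # 22.0
print(path_balance_ok(-95.0, 20.0))  # True (within 6 dB)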
The discussion in Section 9.6.2.3 below provides guidelines for dialing in SIF power levels and changing antenna pointing angles. Finally, with increased power and/or up-tilts one might need to increase the cell radius in the database. For more information on cell site radius, see Dillon's Parameters and Optimization: http://www.cig.mot.com/cdma_ase/index.html.

A clear understanding of the limits of a network's coverage is key to streamlining the optimization activity and avoiding wasted effort in areas that are not predicted to be covered. The initial coverage drive should have been designed to extend slightly beyond the predicted coverage area. Initial optimization should be conducted to maximize the network's measured coverage area, but it is important not to go past the point of diminishing returns. Subsequent drives do not need to be so generous after the coverage bounds have been maximized. Communication, understanding, and resolution of the measured versus predicted coverage bounds are imperative. Despite this, in some instances there is justification for continuing to collect data outside of the maximized, measured coverage area. For example, consider the need to travel from one area of coverage to another along a specific metric route passing through an uncovered area: the area may be targeted for future cell sites, and collection of data along those routes would be helpful for identifying the optimal number and locations of the BTSs. One caution is that collection of data outside of the predicted coverage area will skew the statistics of the overall drive. Data collected in the uncovered areas should be removed from the reporting of statistics and performance trends, especially for warranty testing purposes (Chapter 10).

9.6.2.3 CASE 3: Poor RF Coverage Inside the Predicted Coverage Area, But Not Next to a Cell Site

The following five steps should be followed to diagnose and select the specific PNs whose signals should be strengthened to repair coverage problems, and then to determine the required change in SIF pilot power or antenna pointing angle.

Step 1: Verify Existence of Coverage Problem Areas

The first step is to verify that areas exhibiting low Rx values and correspondingly high Tx values inside the predicted area of coverage are actually performing poorly. Table 8.6.1.1, "Relationship Between Number of Pilots Serving an Area and Acceptable Mobile Receive Signal Strength" (in Chapter 8), is a general guideline identifying acceptable receive levels based upon the handoff state of the mobile; in general, the more pilots serving a given area, the greater the measured mobile receive power should be. Case 2 above identifies reasonable cutoffs for mobile Tx values. Once the areas have been identified, the forward and reverse link Frame Erasure Rates (FER) should be evaluated. It is possible to have a low receive and high transmit power and still have good FER if the noise (Io) in the area is minimal (e.g. only one server is present). However, when low receive values are coupled with multiple servers, these situations need to be investigated. Generally, areas where forward FER values exceed 2 percent should be investigated.

Step 2: Identify Candidate PNs Serving, or Which Should Serve, the Problem Area

At this point, one needs to get a general understanding of which sectors or PNs are, or should be, providing service in a given problem area, and to evaluate the associated signal quality for those PNs. PN plots, i.e. plots of Ec/Io for individual PNs, are generated using the mobile diagnostic monitor's measured and recorded temporal analyzer data; this data represents the three finger offset values or fields in the processed data. The individual sectors or PNs serving a particular problem area can be extracted from the mobile messaging by looking at the Pilot Strength Measurement Messages (PSMM). Make a list of all PNs serving the problem area and their corresponding Ec/Io values. There may be problems with specific sectors not transmitting, or neighbor lists may be inaccurate so that pilots are not reported in the PSMMs; these situations should be investigated and corrected.

Step 3: Evaluate PN Plots for Sectors Serving the Problem Area

For each PN serving the problem area, a PN plot should be generated. The PN plots for individual sectors should be examined to determine the reasonableness of each sector's coverage footprint. Examples of poor footprint patterns include:

1. PNs that do not show up, especially along the main beam (boresight) of the sector.
2. PNs that are inconsistently scattered throughout the intended coverage area.
3. PNs present throughout the intended coverage area, but consistently exhibiting Ec/Io values generally below Tadd.
4. PNs extending beyond the first tier of surrounding cell sites (see Figure 9.6.3-4, a COMPAS plot illustrating an overshooting PN).

PNs exhibiting any of these characteristics should be investigated to determine whether obstacles are causing localized degradations in mobile receive/transmit levels that could be worked around by raising antennas, changing pointing angles, or increasing SIF pilot powers. Slight path imbalances can occur on the forward and reverse links due to multipath conditions caused by natural (terrain, trees) or man-made (buildings) obstacles. A fair evaluation of the PN plots requires that the data being evaluated adequately cover the area of interest (sufficient drive routes); an incomplete data set can make a sector that actually has a good coverage footprint look as if its footprint is incomplete.

Step 4: Identify Which PNs Can Be Improved, Using Mobile Tx as a Guide

As in Case 2, the mobile Tx levels can be used to determine whether the problem area can be improved. If the Tx value is above +17 dBm and there is no interference, the mobile is having difficulty maintaining the reverse link. If the Tx plot shows marginal (+10 to +17 dBm) or adequate (below +10 dBm) values of mobile Tx, but the mobile Rx level is poor, then forward link improvements should be investigated. This is best achieved through changes in SIF power values, antenna pointing angles, or a combination thereof. To determine which sites/sectors should be changed, one should consider the following questions for each site/sector serving the poor coverage area:


1. Is the coverage problem very localized, and would re-directing the RF energy into the problem area with a change in antenna azimuth or tilt help? If so, which sector would provide the most improvement (based on the distance between the candidate antennas and the problem area, and the angle off the main antenna boresight)?

2. Is the coverage problem more widespread, and would a broader correction (increase) of SIF pilot power in the entire area be a better solution? If so, which sector would provide the most improvement (based on the relative distances between the problem area and the candidate servers)?

3. Whether the best correction is a change in antenna pointing angle or in SIF pilot power, what penalty would be paid, in terms of degrading other areas, if the modification(s) were implemented? Modifying the database or antenna configurations to correct a problem could introduce negative impacts in surrounding or adjacent areas. Prioritization of coverage areas and maintaining pilot dominance (Section 9.6.3) are critical.

Isolating Forward and Reverse Link Problems

The mobile messaging sequence can be examined to determine whether a coverage issue results from a forward link or a reverse link problem. Forward link deficiencies (high forward link FER) will be characterized by a high number of Power Measurement Report Messages (PMRM); see Dillon's document describing normal forward power control operation. The PMRMs are typically accompanied by repeats, on the reverse link (mobile to base station), of messages that require an acknowledgement (e.g. PSMM) but receive inconsistent or non-existent responses from the BTS. Repeat messages can be identified by deciphering the (acknowledgement sequence, message sequence, acknowledgement required) settings of the IS-95 messages. Similarly, forward link deficiencies can be identified in the mobile logs by the following scenario:

1. The mobile is receiving messaging from the base station.
2. The mobile is sending acknowledgement responses back to the BTS for the messages that require a response.
3. The BTS nevertheless sends the same message or messages with the same acknowledgement sequence number, but increments the message sequence number.

For more detail on mobile messaging, see the CDMA Call Flow Examples in the CDMA Cellular Air Interface for IS-95A and TIA/EIA-95-B at http://www.cig.mot.com/standards/CDMA_STDS/TIA_CDMA_STDS.html#ai. Most forward and reverse link information can be derived from examining the data collected by a DM. If additional verification is required, messaging collected at the BTS/CBSC can be logged using SMAP, which will also provide reverse FER data; the use of SMAP will require additional data transfer and processing time. (See http://scwww.cig.mot.com/~thakkar/smap.htm.) A sketch of the repeat-detection check described above follows.
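In this sketch the parsed-record fields are an assumption, as is modulo-8 message sequence numbering:

def bts_repeats(messages):
    # messages: parsed forward-link records with 'msg_type', 'ack_seq',
    # and 'msg_seq' fields.  A BTS resend keeps the message type and
    # acknowledgement sequence but increments the message sequence; runs
    # of these, when the mobile log shows it did send acknowledgements,
    # point at the forward link.
    repeats = []
    prev = None
    for m in messages:
        if (prev is not None
                and m["msg_type"] == prev["msg_type"]
                and m["ack_seq"] == prev["ack_seq"]
                and m["msg_seq"] == (prev["msg_seq"] + 1) % 8):
            repeats.append(m)
        prev = m
    return repeats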


Step 5: Determining Power and Tilt Changes

SIF Power Changes

Permanent changes in SIF pilot power should be directed toward widespread coverage problem areas served by a particular PN. The diagnosis should indicate that an increase in a particular sector's power level will not adversely affect surrounding or adjacent areas (i.e. introduce pilot pollution or non-dominant conditions). To make a noticeable change in coverage, a minimum SIF change of 2 dB should be investigated; a rule of thumb is to try increments of 3 dB to see useful improvement. It is recommended that the power differences between adjacent sectors on the same cell site not exceed 6 dB, since a large variation could cause degradation in system performance when the CDMA system becomes loaded. For allowable BTS pilot power adjustment ranges, see Table 5-7 in the CDMA RF Planning Guide, which can be downloaded from http://www.pamd.cig.mot.com/nds/cts/rftech/public_html/Documents/RFPG2.1/rfguideV2.1.html.

A precise estimate of the required SIF pilot power value can be arrived at by calculating the power at the BTS antenna and using the mobile receive level to calculate the path loss. For equations related to calculating the path loss, see Propagation Models in the CDMA RF Planning Guide. Using the calculated path loss, one can derive the required SIF power setting. Ideally, one would then run a simulation to verify that the new value does not have any negative impact.

Antenna Pointing Angle Changes

Redirecting antennas reallocates the RF energy in the most efficient manner. This solution should be considered for acute, focused problem areas requiring substantial correction. In general, implementing antenna angle changes is more time-consuming and costly than SIF power changes. There may also be added complexity when the CDMA antennas are shared with other cellular technologies, such as analog or TDMA networks. Options for electronic versus mechanical tilting should be investigated; if the antenna redirection would produce only minimal results, the effort might not be justifiable. A change in azimuth or tilt angle can be emulated, short term, by an increase or reduction in SIF pilot power. This will require a very focused drive test in the problem area, and care must be taken in drawing conclusions from surrounding areas that may appear degraded or improved. If the problem area is improved with the simulated antenna redirect, the antenna redirection should be implemented and the area re-driven to verify its performance. Calculations (described below) should be made to determine what change in SIF pilot power would emulate a given antenna tilt change. For example, if an area at the edge of the coverage area (distant from the closest site) required 3 dB of additional power to provide adequate Ec/Io, the SIF power could be increased by that 3 dB to emulate an antenna up-tilt.

The two methods for antenna tilting are electronic tilting, using an N-element phased array antenna, and mechanical tilting, by physically tilting the antenna. Down-tilting can help RF coverage holes near the site where there are nulls in the antenna pattern; down-tilts are also deployed to reduce pilot pollution by changing the RF propagation footprint. Up-tilts are used to extend the RF propagation footprint and can be used in addition to SIF changes to extend the coverage of a sector.

Calculating Antenna Tilt Angle Changes

The optimization engineer can use three approaches to evaluate antenna patterns and determine a required tilt change. One should select between 3 and 5 key points, at different distances, to calculate the change in power distribution related to an antenna tilt and to understand the effect the tilt change will have on the coverage footprint. The three basic approaches are:

1. Use of antenna catalogs to estimate beam patterns.
2. Graphical, using simulation software such as NetPlan.
3. Numerical, using Unix commands in conjunction with the NetPlan antenna files.

Use the information gathered from the sources above to complete the exercise below.

a. Calculate the angle between the horizontal and the ray defined by the endpoints of the antenna location and the problem area location. The following equation can be implemented in Excel:
A = ROUND(-DEGREES(ATAN(((h + ges) - (getl + toh)) / dtl)), 2)

where:
A = angle to target (degrees)
h = antenna height
ges = ground elevation at site
getl = ground elevation at target location
toh = target object height
dtl = distance to target location

b. Identify the antenna gain from the antenna beam pattern. In the NetPlan directory, the antenna directory contains files describing the vertical pattern of each antenna type at each tilt. For example, the file md8cr6xs8-5.v is for this antenna type at a down-tilt of 5 degrees. The file contains a table that starts at 90 degrees and increments by one degree of angle-to-target per entry; the data entries are the antenna vertical gain. The up-tilt data files in NetPlan use the letter "u" in front of the angle; for example, the file name md8cr6xs8-u1.v is for the same antenna type, but represents an antenna tilt of one degree positive. Use these files to look up the antenna vertical gain for the current antenna tilt and for the tilt angle or angles under consideration.

c. For angles to the target containing fractions of a degree, interpolate between the rounded-up and rounded-down angles to arrive at a vertical gain.

d. Finally, subtract the old tilt gain from the new tilt gain to get the gain difference for the tilt change. A short scripted version of steps a through d is sketched below.
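The following sketch assumes the vertical-pattern tables have already been read out of the NetPlan antenna files into {degree: gain} dictionaries; the file parsing itself is omitted, and the example geometry is invented:

import math

def angle_to_target(h, ges, getl, toh, dtl):
    # Step a: angle to target in degrees (negative when the target is
    # below the antenna), matching the Excel formula above.
    return round(-math.degrees(math.atan(((h + ges) - (getl + toh)) / dtl)), 2)

def gain_at(angle, vtable):
    # Steps b-c: vertical gain from a {whole_degree: gain_dB} table,
    # linearly interpolated for fractional angles.
    lo, hi = math.floor(angle), math.ceil(angle)
    if lo == hi:
        return vtable[lo]
    frac = (angle - lo) / (hi - lo)
    return vtable[lo] + frac * (vtable[hi] - vtable[lo])

def tilt_gain_delta(angle, old_vtable, new_vtable):
    # Step d: gain difference the tilt change produces toward the target.
    return gain_at(angle, new_vtable) - gain_at(angle, old_vtable)

# Example: target 2.5 km away, antenna 45 m up at a site 20 m higher
print(angle_to_target(h=45, ges=300, getl=280, toh=5, dtl=2500))  # -1.37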

9.6.3 Pilot Pollution

This section contains procedures to help engineers identify and analyze pilot pollution and non-dominant pilot problem areas, and to determine solutions that eliminate or reduce the problem. The basic process for identifying pilot pollution and non-dominant pilots is the same. Pilot pollution (too many pilots) can be defined as the existence of four or more pilots with Ec/Io values greater than Tadd; to correct this problem the engineer needs to decrease the amount of energy reaching the problem area. Lack of dominance can be defined as low Ec/Io levels, numerous pilots with similar Ec/Io values, and four or more pilots above the Tadd threshold; to correct this problem the engineer needs to make up to three of the pilots in the area stronger, or the other ones weaker. These changes will create pilot dominance in the area and reduce the number of pilots that appear in the active set, thereby reducing the amount of interference in the area.

The examples given below primarily make use of Motorola tools (NetPlan and COMPAS). Other equivalent system planning and simulation tools providing the same information can be substituted; for a partial list of available planning and simulation tools refer to Chapter 2, and for other tools refer to Chapter 6. Some ideas are also presented for markets that do not use planning and simulation tools. The basic process discussed below involves:

1. Verify the neighbor list is complete.
2. Verify there are no PN reuse issues.
3. Create a data table.
4. Determine line of sight.
5. Identify overshooting sites.
6. Determine corrective action.
7. Evaluate recommendations.
8. Implement changes.

1. Verify the neighbor list is complete. See Section 9.6.1 for a detailed discussion of neighbor lists and neighbor list tools.

2. Verify there are no PN offset reuse issues. PN offset reuse problems occur when different sites/sectors that provide overlapping coverage use the same PN offsets. In these cases, the MM may not make a correct decision about which BTS/sector is represented by a particular PN, and interference will occur when the MM assigns channels from the incorrectly selected BTS to a mobile within the overlapping coverage areas. The engineer analyzing the problem area should be familiar with the PN offsets assigned to the sites in that area.


In addition, cell radius sizes should be consistent with the coverage footprints of the different sectors. Cell radius sizes for the various releases can be found on the parameters web page, and explanations of cell radius settings are contained in the CDMA RF Application Note: Parameters and Optimization, all located at http://www.cig.mot.com/cdma_ase/index.html.

To investigate PN offset assignments and rule out any potential problems, the PN offsets assigned to a BTS can be graphically displayed using various tools. One method is to use the CDMA PN offset/handoff tool in NetPlan. By creating a text file that identifies each BTS, sector, and corresponding PN offset, the engineer can display all the PN offsets for the BTSs in a system analysis. An example of the text file is shown in Figure 9.6.3-1 and an example of the graphical display is shown in Figure 9.6.3-2. The text file should be put into the Compas directory of NetPlan.

NetPlan - PN offset plan
* Copyright(s) by Motorola Inc.
* All Rights Reserved.
* PN offset plan: Our_system
*BTS/Sector  PN
105/1  16
105/2  20
105/3  24
103/1  4
103/2  8
103/3  12
124/1  28
124/2  32
124/3  36
124/4  40
124/5  44
124/6  48

Note: This is only a sample; the entire PN plan would have to be put into text format.

Figure 9.6.3-1: PN offset plan (text file)


[Figure 9.6.3-2: PN Output Plot in COMPAS]

The Pilot Analyzer tool can also be used to view all PN offsets in an area. The inputs to this tool are files from the HP pilot scanner. The tool can draw a plot of the drive route showing areas of high FER, pilot pollution, and poor Ec/Io. For more information on this tool, see Chapter 6 of this document, or the web page at http://engdb.tlv.cig.mot.com/tools/PilotAnalyzer.html. Table 9.6.3-1 shows the output for a pilot pollution area as detected by the Pilot Analyzer.

Index  Pilot  Ec/Io   Pk. Power  Ag. Power  Samples  Delay  Spread
1      348    -6.56   -70.69     -70.51     1        19.12  0.00
2      345    -9.80   -73.93     -73.93     1        19.61  0.08
3      180    -11.58  -75.71     -75.71     1        18.63  0.54
4      198    -13.53  -77.65     -77.65     1        27.47  0.25
5      27     -15.07  -79.20     -79.20     1        48.22  0.00
6      408    -15.13  -79.25     -79.25     1        28.95  0.00
7      18     -15.24  -79.37     -79.37     1        38.39  0.00

Table 9.6.3-1: Pilot Analyzer Output

If a PN reuse problem is confirmed, the engineer should re-assign PN offsets for one of the conflicting sites. For additional information on PN planning, refer to the CDMA RF Planning Guide located at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html (select the RF Planning button).

3. Create a data table. Whether the problem is too many pilots or lack of a dominant pilot, the initial objective is to identify:

A. All sites/sectors pointing into the problem area.
B. The distance from the sites serving the area to the problem area.
C. The Ec/Io values for the PN offsets serving the problem area.

A. List all sectors pointing into the problem area. This can be done by reviewing the mobile messaging collected within the problem area. The Pilot Strength Measurement Message (PSMM) contained in the mobile messaging lists the PN offsets serving that area as well as their pilot strengths. All PN offsets serving the area should be recorded in a data table, such as the ones shown in Tables 9.6.3-2 and 9.6.3-3, along with their pilot strengths. The corresponding mobile messaging is reproduced after each table. The first example illustrates the lack of a dominant pilot in an area: the pilots are all within ±1.5 dB of each other and of Tadd. The second illustrates too many pilots in an area: seven pilots are seen, all greater than or equal to Tadd. A sketch that extracts this table from logged PSMM lines appears below.
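Building the data table from logged PSMMs can be scripted. The sketch below uses the message format shown in the examples that follow; the Tadd and "strong pilot" thresholds are illustrative, not system values:

import re

PN = re.compile(r"(?:REF_PN|PN_OFFSET)=(\d+)(?:\.\d+)?, PILOT_STRENGTH=(-?\d+\.?\d*)")

def psmm_pilots(line):
    # Extract (pn, Ec/Io dB) pairs from one logged PSMM line
    return [(int(pn), float(db)) for pn, db in PN.findall(line)]

def classify(pilots, t_add=-14.0, strong_db=-12.0):
    # Both cases have four or more pilots at or above Tadd.  One crude
    # separator: call it non-dominant when even the best pilot is weak,
    # pilot pollution otherwise.
    above = [db for _, db in pilots if db >= t_add]
    if len(above) < 4:
        return "ok"
    return "non-dominant" if max(above) < strong_db else "pilot pollution"

line = "REF_PN=156, PILOT_STRENGTH=-12.500000, KEEP=1, PN_OFFSET=264.0, PILOT_STRENGTH=-14.500000, KEEP=1"
print(psmm_pilots(line))  # [(156, -12.5), (264, -14.5)]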


Table 9.6.3-2: Data Table for Non-Dominant Pilots

BTS-Sector & PN# | Ec/Io (dB) | Distance from sector to problem area | Improvement req'd in Ec/Io (dB) | Angle off antenna boresight (deg) | Antenna change? | LOS or terrain issues? | SIF power change?
297-2 PN 156 | -12.5 | 2 km   | 3.5 | 5  | No       | No          | Increase by 2 dB
260-1 PN 264 | -14.5 | 2.2 km | 2   | 0  | 2 uptilt | Small hills | No
297-2 PN 152 | -13.5 | 2 km   | 4   | 2  | No       | No          | Increase by 3 dB
298-1 PN 40  | -14   | 8.5 km | -   | 90 | No       | No          | No
298-3 PN 48  | -13   | 8.7 km | -   | -  | No       | No          | No

Example 1: Lack of dominant pilot

23:50:40:881, FEB 08 1999, HANDOFF, PSMM, <--, 4, 0, 1, ENCRYPTION=0, REF_PN=156, PILOT_STRENGTH=-12.500000, KEEP=1, PN_OFFSET=264.0, PILOT_STRENGTH=-14.500000, KEEP=1, PN_OFFSET=152.0, PILOT_STRENGTH=-13.500000, KEEP=1, PN_OFFSET=40.0, PILOT_STRENGTH=-14.000000, KEEP=1, PN_OFFSET=48.0, PILOT_STRENGTH=-13.000000, KEEP=1
23:50:41:059, FEB 08 1999, HANDOFF, PMRM, <--, 4, 6, 0, ENCRYPTION=0, ERRORS_DETECTED=2, PWR_MEAS_FRAMES=36, LAST_HDM_SEQ=2, NUM_PILOTS=2, PILOT_STRENGTH=-13.000000, PILOT_STRENGTH=-14.500000
23:50:41:169, FEB 08 1999, HANDOFF, NLUM, -->, 0, 7, 1, ENCRYPTION=0


Table 9.6.3-3: Data Table for Too Many Pilots

BTS-Sector & PN# | Ec/Io (dB) | Distance from sector to problem area | Improvement req'd in Ec/Io (dB) | Angle off antenna boresight (deg) | Antenna change? | LOS or terrain issues? | SIF power change?
295-2 PN 16  | -9.5  | 4 km  | -    | 20  | No           | No                      | No
230-1 PN 104 | -12.5 | 7 km  | -4.5 | 45  | 3.5 downtilt | Multipath from building | No
219-3 PN 96  | -13.5 | 10 km | -3   | 25  | No           | Scatter from hills      | Decrease by 4 dB
298-1 PN 40  | -8    | 2 km  | -    | 5   | No           | No                      | No
298-2 PN 44  | -10   | 2 km  | -2.5 | 90  | No           | No                      | Decrease by 3 dB
295-2 PN 12  | -11.5 | 5 km  | -4   | 120 | 3 downtilt   | No                      | No
297-2 PN 152 | -9    | 3 km  | -    | 10  | No           | No                      | No

Example 2: Too many pilots

03:04:34:789, FEB 03 1999, HANDOFF, FOM, -->, 2, 0, 0, ENCRYPTION=0, USE_TIME=0, ACTION_TIME=0, ORDER=16, ADD_RECORD_LEN=0
03:04:34:803, FEB 03 1999, HANDOFF, PSMM, <--, 7, 4, 1, ENCRYPTION=0, REF_PN=16, PILOT_STRENGTH=-9.500000, KEEP=1, PN_OFFSET=96.0, PILOT_STRENGTH=-13.500000, KEEP=1, PN_OFFSET=40.0, PILOT_STRENGTH=-8.000000, KEEP=1, PN_OFFSET=152.0, PILOT_STRENGTH=-9.000000, KEEP=1, PN_OFFSET=104.0, PILOT_STRENGTH=-12.500000, KEEP=1, PN_OFFSET=12.0, PILOT_STRENGTH=-11.500000, KEEP=1, PN_OFFSET=44.0, PILOT_STRENGTH=-10.000000, KEEP=1

The engineer should also include PN offsets for sectors that are physically pointing toward the problem area but may not show up in the PSMM. This list will be used later to determine corrective action.

B. List the distance (physical, i.e. miles/km) from the site broadcasting each PN offset to the problem area. The distance between the site and the problem area will help prioritize which PN offsets require adjustments to SIF pilot powers or antennas to eliminate the problem. The distance from the BTS to the problem area can be determined using the path profile feature in NetPlan; it can also be determined using topographic maps or other system planning tools. The distances identified should be recorded along with the information obtained in step A.

Figure 9.6.3-3 illustrates the path profile plot generated using NetPlan. The X axis represents the distance between the cell site (the first latitude and longitude listed) and the location of the problem area or mobile (the second latitude and longitude listed); in addition, the bearing from the cell site to the problem area or mobile, with respect to due north, is provided. The Y axis shows the height above sea level in meters as a thin line, with the antenna height represented by the heavy line. The dotted line from the top of the antenna to the problem area or mobile is the Fresnel zone, the solid upper line is the direct line of sight from the antenna to the problem area or mobile, and the solid lower line represents the terrain.

[Figure 9.6.3-3: NetPlan Path Profile Plot]

C. List the Ec/Io values for the PN offsets listed in steps A and B. Sites with Ec/Io values significantly below Tadd should be considered as sites on which to perform corrective action.

4. Determine line of sight. Determine whether the sectors identified in step 3 above have an unobstructed line of sight to the problem area. This can be done using the path profile feature mentioned in item B above (see Figure 9.6.3-3) in NetPlan/COMPAS. It may also be done with other planning tools that check line of sight, or by reviewing topographic maps and site pictures that provide a view of the surrounding buildings and/or terrain. If none of the above is available, a site visit may be necessary. Sites whose obstructed views contribute to the interference problem can be included in the list of sites to be considered for corrective action.

5. Identify overshooting sites. Identify any PN offsets that originate from more than one tier away from the sites surrounding the problem area. If using COMPAS, use the PN plot feature to plot individual PN offsets; see the COMPAS output in Figure 9.6.3-4.

[Figure 9.6.3-4: COMPAS Plot Illustrating an Overshooting PN. The plot shows sites 295, 297, 298, 260, 230, and 219 with their sector PN offsets, a non-dominant server area, a pilot pollution area, and one PN overshooting beyond the first tier of surrounding sites.]

The pilot strength information and the PN distance information obtained in step 3 can be used to determine overshooting PNs by correlating the PNs identified in the problem area with the PNs that should actually be serving the area. PNs that overshoot their expected coverage area should be put on the list of candidates for corrective action; it is most desirable to provide coverage in all areas using the closest site. A data table such as the ones shown in Tables 9.6.3-2 and 9.6.3-3 is helpful for keeping track of the information obtained in the previous steps and will assist the engineer in identifying candidate sites for corrective action.


6. Determine corrective action. After reviewing the data collected in the previous steps, the engineer must determine which PN offsets require corrective action in order to remove the pilot pollution. The key is to create an appropriate number of adequate pilots, preferably three.

For cases of too many pilots, the objective is to remove the excess strong pilots by adjusting the amount of energy of each pilot, either by changing SIF powers or by making antenna adjustments. In example 2 in section 3.A there are seven pilots listed in the PSMM for the problem area; PN 16 (pilot strength -9.5), PN 40 (-8.0), and PN 152 (-9.0) are the three best active PN offsets in that example. If steps 1-5 have already been completed and the remaining four PN offsets are listed in the data table for corrective action, the engineer should decrease the total amount of energy those sites provide to the area with SIF power changes and/or antenna changes.

For non-dominant pilots, the engineer needs to make up to three of the pilots in the area stronger. In example 1 in section 3.A, there are five PN offsets in the PSMM, all with relatively low pilot strengths. After completing steps 1-5, the engineer should select the strongest three pilots and increase the energy they provide to the area; it may also be necessary to decrease the energy of the other PN offsets.

After determining how many dB the various sectors must be raised or lowered, the engineer must determine whether an antenna adjustment or a SIF power change is the best solution; refer to Section 9.6.2 for information regarding antenna adjustments and SIF power changes. If neither a SIF power change nor an antenna adjustment is effective by itself, a combination of the two may be useful. A first-pass tabulation of the required per-PN changes is sketched below, after step 7.

7. Evaluate recommendations. Determine whether the proposed changes will adversely affect other areas. Some things to check:

A. Will antenna/SIF power changes decrease or increase coverage for this site or adjacent areas?
B. Will antenna/SIF power changes create pilot pollution problems in adjacent areas?

After evaluating the results of the proposed changes, decide which change will have the most positive effect and create the fewest problems for the area in question and adjacent areas, and make recommendations.

Note: Proposed recommendations may require changes to the neighbor list if energy from various BTS/sectors is redistributed.
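The sketch below tabulates first-order deltas only; it ignores the interaction of the pilots through Io, so any change it suggests still has to be verified by simulation or a re-drive. The keep count, Tadd, and margin values are illustrative:

def corrective_deltas(pilots, t_add=-14.0, keep=3, margin_db=1.0):
    # pilots: {pn: ec_io_dB}.  Keep the 'keep' strongest pilots and
    # return the first-order dB change for each of the rest needed to
    # push it margin_db below Tadd (negative = reduce SIF power or
    # downtilt).
    ranked = sorted(pilots.items(), key=lambda kv: -kv[1])
    changes = {}
    for pn, ecio in ranked[keep:]:
        if ecio >= t_add:
            changes[pn] = (t_add - margin_db) - ecio
    return changes

# Example 2's seven pilots: the four weakest strong pilots get reductions
pilots = {16: -9.5, 104: -12.5, 96: -13.5, 40: -8.0, 44: -10.0, 12: -11.5, 152: -9.0}
print(corrective_deltas(pilots))
# -> {44: -5.0, 12: -3.5, 104: -2.5, 96: -1.5}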


8. Implement recommended changes. Update the Problem Resolution Matrix and re-drive the problem area. Process the data and re-evaluate to determine whether the problem has been resolved. If the problem still exists, repeat this process until the pilot pollution is eliminated or no further improvements can be made. In some cases it may be impossible to completely correct the pilot pollution in an area; these cases should be handled individually, each being reviewed with the customer to determine the most acceptable solution. Additional information on optimizing pilot pollution can be obtained from the CDMA RF Application Note: Parameters and Optimization, located at http://www.cig.mot.com/cdma_ase/index.html.

9.6.4 Infrastructure Issues

During optimization, problems may be encountered in areas where the RF coverage is adequate and the Ec/Io for all pilots is good. When problems are encountered in good RF coverage areas, focus should turn to identifying potential infrastructure problems. Various indicators can help the optimization team isolate these problems. Those indicators primarily come from three sources:

- Drive team and/or CBSC operator reports
- Call Final Classification (CFC) distributions
- Event logs / alarms

Evaluation of these sources of data can lead to isolation of the problems. Drive team reports that may indicate specific problems include:

- Access attempts that repeatedly fail in a particular area
- PN offsets that should be seen at a particular location but are not registering on the DM
- Handoffs that are not occurring at a particular location (inter-sector, inter-cell, inter-CBSC, inter-EMX)
- High RF loss rate
- Consistent or predictable drops, after a certain period of time or in a specific area

Assuming that the data collection equipment was operating properly, CFCs should be examined to determine whether there are any hardware or database configuration problems; CFCs can be used to troubleshoot a variety of infrastructure problems. Event logs contain site/sector status and alarm information that can be evaluated to determine the stability and functionality of the hardware, as well as provide indicators of the presence of interference in the system. A systematic approach to isolating possible infrastructure (database or hardware) configuration problems is presented below. The primary focus is on BTS and CBSC hardware and database issues (the EMX is not discussed). Figure 9.6.4-1 shows the hierarchy of the CDMA system; a more detailed diagram can be found in the System Commands Reference, Volume 4, Device State Management (Figures 12-2 to 12-4).

Figure 9.6.4-1: System Architecture Overview

PSTN (Public Phone Network)
   EMX (Switch)
      OMC-R (Operations Maintenance Center - Radio)
      CBSC: XC (Transcoder) and MM (Mobility Manager)
         BTSLINK
            BTS (GLI, CSM, BDC, LCI, BBX, MCC)

- EMX: Switch or MSC; relays call status information and voice between the CBSC and the public phone system.
- OMC-R: Application allowing engineers to manipulate the database, load and manage BTS software, and monitor system performance data.
- XC: Transcoder; converts STRAU data to QCELP format.
- MM: Mobility Manager; handles call processing functions for mobile stations.
- BTS: Cell site; communicates over the IS-95 air interface with mobile stations.


9.6.4.1 Common Drive Team Failure Reports

The drive team will occasionally report problems to the system engineers. The drive team's written test logs can be used to verify where and when the problems occurred. Table 9.6.4.1-1 lists common drive team problems along with their likely causes and suggested follow-up investigations.

Symptom: Access attempts are repeatedly failing in a particular area.
Likely cause(s): BTS is off the air. Mobile subscriber access class (SAC) is set incorrectly.
Investigate: Check device status and alarms. Check ACCOLC in the CBSC and in the mobile.

Symptom: PN offsets that should be seen at a particular location are not registering on the DM.
Likely cause(s): BTS is off the air. PN offsets in the database have been changed or are incorrect. Antennas are hardwired incorrectly. GPS is not working. CSM is not working.
Investigate: Check device status, alarms, and the CBSC database (SECGEN). Check BTS hardware and cabling configuration; follow up with a site visit by a CFE.

Symptom: Handoffs are not occurring at a particular location (inter-sector, inter-cell, inter-CBSC, inter-EMX).
Likely cause(s): Neighbor lists are inaccurate. Routing tables are incomplete.
Investigate: Review the NL, XC-sect database, and routing tables.

Symptom: Call drop rate is extremely high (mobile Rx value may be lower than normal).
Likely cause(s): A BTS is off the air. Mobile antenna location and orientation problem.
Investigate: Check alarms and device status. Review the equipment configuration with the drive team.

Symptom: Calls drop or fail to access traffic channels consistently after a specific period of time.
Likely cause(s): Mobile subscriber access class discrepancy. Call setup failure in the infrastructure.
Investigate: Check the ACCOLC of the mobile and the CBSC database. Check device status and alarms.

Table 9.6.4.1-1: Drive Team Problem Reports and Likely Causes


9.6.4.2 Problems That Can Be Diagnosed Using CDLs/CFCs

The use of Call Detail Logs (CDLs) at the CBSC can help narrow down the time when hardware or database problems were experienced. CDLs should be correlated with events such as dropped calls and access failures. The following steps outline the procedure to collect, process, and analyze CDLs.

1. Data collection. CDLs are located on, and downloaded from, the OMC-R in the /sc/cdl/ directory. These files are named in the format cdl plus date (i.e. cdl.YYYYMMDDHH) and should be FTPed to the engineer's local computer.

2. Data processing. The CDLs can be processed using cdl_browse and the CDL Analyzer Tool (CAT). These tools and their usage documentation are found at http://www.cig.mot.com/~wheelrts/cat1_5.html.

3. Data analysis. The CAT creates several reports that show the CFC distribution and highlight any abnormal levels of CFCs that may indicate an infrastructure problem. The reports generated by the CAT can be used to find potential problems with the infrastructure configuration or database and are described in detail at http://www.cig.mot.com/~wheelrts/CAT1.5/docs/analyzer.doc.

Table 9.6.4.2-1 is a normal distribution of CFCs for one CBSC. A normal distribution will vary between markets due to configuration differences; however, most optimized systems have CFC distributions in which, typically, the count of CFC 13s is less than the count of CFC 9s, which in turn is less than the count of CFC 5s. If there is an unusually high percentage of non-RF-related CFCs (see item 2 below), the CFC in question should be investigated and reported to the CBSC and database engineers.



CFC 1 3 4 5 7 8 9 13 15 24 26 27 28 60 62 130 255 Raw Total % of Call Total 184608 84.77% 428 0.20% 4705 2.16% 2423 1.11% 32 0.01% 166 0.08% 31 0.01% 2 0.00% 807 0.37% 7787 3.58% 16220 7.45% 60 0.03% 10 0.00% 319 0.15% 187 0.09% 1320 0.61% 10 0.00% 217785 100.00%


Table 9.6.4.2-1: Normal CFC Distribution

CFCs 24 and 26 are considered good calls. A CFC 26 typically occurs when a mobile is paged but does not answer; a timer at the EMX expires and disconnects the call. Including the 24s and 26s, the call completion rate in this example rises to 95.8% (84.77% + 3.58% + 7.45%). If a particular drive test has a high number of abnormal CFCs, i.e. a distribution unlike the normal one in Table 9.6.4.2-1, follow the steps below to troubleshoot the cause of the CFC.

1) Align the drive test log sheet data with the CDLs and determine the CFC for the problematic call. To align the CDLs with the mobile data, first look at the drive test logs and identify where and when the problem occurred. Notice in this example the first failed call was at 8:23 a.m. Also note the dialed digits, ESN, and mobile ID from the drive test logs. Next, look through the CDLs for records whose ESN, ACCESS_TIME, ACCESS_BTS, ACCESS_SECTOR, ENTRY_TYPE, and DIALED_DIGITS fields match. Once the closest match is found, the CFC in that CDL will identify the type of failure the drive team was experiencing. See Figure 9.6.4.2-1 (an example of drive team log sheets with FTOs) and Figure 9.6.4.2-2 (the corresponding CDL) for an example of correlating these two data sources. Look at the highlighted areas in the drive team log sheet and the CDL, and notice that the access times do not match exactly. The access and disconnect times will not be the same between mobile data and CDLs, because the mobile data time source is GPS while the CBSC time source is an internal clock.

2) Determine what type of failure the mobile is having. In the example below, the failure seen is a CFC 9. To determine whether a failure is database or hardware related, the engineer should become familiar with what each CFC means. The CFC Resolution document describes in detail the most common CFCs seen during system optimization, what equipment those CFCs relate to, and possible solutions to alleviate each particular CFC. This document can be found at http://www.cig.mot.com/~bohrer/cfc1.2.pdf.
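When the CDL files are large, the record matching described in step 1 can be located quickly with grep. The sketch below assumes the hourly CDL has already been rendered to ASCII with cdl_browse (see the CAT documentation for invocation details) and saved under an illustrative name, cdl_0319.txt; the ESN and dialed digits are the ones from the example log sheet.

    # Locate candidate CDL records for the test mobile, then narrow the
    # match by dialed digits; field names are as they appear in the CDL.
    grep -n 'ESN=0x86fc1c82' cdl_0319.txt
    grep -n 'DIALED_DIGITS=09028979961' cdl_0319.txt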



[Figure 9.6.4.2-1 reproduces a completed drive test log sheet. Key entries: Date 3/19/99; Zone CBSC 1; MOBID 440-782-7444; ESN 086FC1C82; Operator Bob Testerman; Number to Dial 090-2897-9961. Calls were placed at two-minute intervals from 08:17:00 to 08:27:00 (DM files 0319990817.qlc through 0319990827.qlc) at locations including 4th & Roswell St., 4th & Fleming Ave., 4th & Becker Ln., Becker & Austin Way, Becker & 23rd, and Becker & 30th. The later attempts are marked as FTOs with weak Ec/Io (-13, -19, -17) on PNs 124 and 128, with the comments "RF Loss on Becker & Ranch" and "Pulling over to call lead engineer".]

Figure 9.6.4.2-1: Drive Test Log Sheet to correlate with CFC 9 problem.




BROWSE CDLLOG MSI-4407827444 99-03-19 15:19:50 omc MM-1 L000000.00000 010938/051106
CDL:1 "Call Detail Log"
[Abridged: the full record contains over a hundred FIELD=value pairs; the fields used to correlate with the drive test log sheet are reproduced here.]
CDL_SEQ_NUM=0x2aba
CALL_REF_NUM=0x0c46
CBSC=1
MID=4407827444
ESN=0x86fc1c82
DIALED_DIGITS=09028979961
ACCESS_TIME=8:19:33
ACCESS_PN_OFFSET=124
ACCESS_BTS=214
ACCESS_SECTOR=3
ENTRY_TYPE=0
SERVICE_OPTION=0x0003
INIT_RF_CONN_BTS=214
INIT_RF_CONN_SECTOR=3
CFC=9
RELEASE_TIME=8:19:40

Figure 9.6.4.2-2: CDL Log Correlating to Drive Team Log Sheet


If the CFC in question appears to be an infrastructure issue, review the CBSC log sheet and look for any maintenance activity or maintenance-related outages. Also review alarm reports and BTS status reports generated at the OMC-R. Table 9.6.4.2-2 contains a list of common optimization problem symptoms along with likely causes and suggested follow-up activities.

Symptom: High FTOs - access attempts are repeatedly failing in a particular area.
  Possible CFCs 3, 5, 6, 7, 9:
    Possible Problem: Site off the air. BTS device is OOS. Too far from site.
    Investigate: BTS and CBSC device status; device checks should include alarms and event logs. Site radius and search window size.
    Devices/Parameters: BTS hardware status, cell radius, mobile search windows.
  Possible CFCs 11, 23:
    Possible Problem: Lack of resources.
    Investigate: Blocked calls from lack of hardware resources. XC outages or configuration problems.
    Devices/Parameters: MCC usage, XC usage, mobile access class.
  Possible CFCs 50, 103, 109, 255:
    Possible Problem: MM, MSC, or XC problem. Data call problem.
    Investigate: Links between MM, BSC, and MSC. Data call hardware status.
    Devices/Parameters: MM, transcoder, switch, IWU, CDP, CPP.

Symptom: Wrong PNs - PN offsets expected to be seen at a particular location are not registering on the DM.
  Possible CFCs 3, 5, 6, 7, 9, 13, 15, 255 (these CFCs may appear due to accessing a neighboring site with no good RF signal):
    Possible Problem: BTS or BTS devices are OOS. BTS database is not correct.
    Investigate: Device status and alarms for the specific BTS. SECGEN (list of PNs) for the BTS being tested. BTS antenna install reports (if available).
    Devices/Parameters: BTS devices such as BBX, BDC, and MCC. BTS antenna configuration. Antenna cabling inside the BTS.

Table 9.6.4.2-2: Optimization Problem Troubleshooting Table (start)


Symptom: Failed H/Os - handoffs are not occurring at a particular location (inter-sector, inter-cell, inter-CBSC, inter-EMX).
  Possible CFCs 8, 10:
    Possible Problem: Poor RF conditions in hard H/O area. MCC problem with hard H/O.
    Investigate: Neighbor lists (XC-sects). RF coverage. Device status. PN offsets.
    Devices/Parameters: Database settings. SECTOP (neighbor list). SIF powers. MAHO/DAHO parameters. SECGEN.
  Possible CFCs 26 - 29, 104 - 108, 130 - 133:
    Possible Problem: EMX configuration problems. Data call disconnected. Inter-CBSC H/O failure. XC failure.
    Investigate: Device status of EMX. Device status and configuration. Neighbor lists. BSC status. XC status.
    Devices/Parameters: IWU, CDP, CPP status. XC-sects. ICTRKGRP.

Symptom: High Drop Rate - call drop rate is extremely high (mobile Rx value may be lower than normal, or mobile Tx may be higher than normal).
  Possible CFCs 3, 4:
    Possible Problem: One site may be OOS. Poor neighbor list. Site parameters may be incorrect. Poor BTS antenna configuration. RF coverage.
    Investigate: Device status.
    Devices/Parameters: BTS status. SECTOP. SIF powers. XC status and configuration. BTS antenna orientation. Mobile Rx & Tx levels.
  Possible CFCs 27, 28, 29, 104 - 107, 130 - 133:
    Possible Problem: Switch configuration. Data call device malfunction. Devices at BSC. Transcoder problem.
    Investigate: Device status and configuration.
    Devices/Parameters: MSC, BSC. XC-sect. IWU, CDP, CPP. SPAN. BSC. XC.

Symptom: Consistent Drops - calls drop or fail to access consistently after a specific period of time or in a specific place.
  Possible CFCs 3, 4, 8, 10:
    Possible Problem: Site may be OOS. Too far from BTS. Mobile overload class is not set correctly.
    Investigate: Device status.
    Devices/Parameters: BTS device status. Search window size. Cell radius. Mobile ACCOLC.
  Possible CFC 15:
    Possible Problem: Access parameters not set correctly. Service option negotiation settings.
    Investigate: Device status.
    Devices/Parameters: Service option parameters on BSC.
  Possible CFCs 27 - 29:
    Possible Problem: Problem with inter-CBSC database or hardware.
    Investigate: Inter-CBSC handoff database. Device status and configuration.
    Devices/Parameters: ICTRKGRP. XC-sect tables. Neighbor lists.
  Possible CFCs 104 - 109, 130 - 133:
    Possible Problem: Switch problem. Data call device malfunction. Transcoder problem.
    Investigate: Device status and configuration.
    Devices/Parameters: EMX. IWU, CDP, CPP. SPAN. XC. XC-sect.

Table 9.6.4.2-2: Optimization Problem Troubleshooting Table (finish)


9.6.4.3 Problems That Can Be Diagnosed Using Event Logs and Alarms

Data Collection
Event logs can be downloaded from the OMC-R (located at /sc/event/). The raw event log files are in ASCII format and do not need to be processed by any tools.

Data Analysis
These logs show all system events that occurred during each hour. Reviewing the event logs for the hour in which call failures occurred can provide helpful information. Alarms are rated as follows:

No star = Warning
* = Minor
** = Major
*** = Critical

Major and critical alarms can often be service affecting. For example, if during one drive test a specific cell site could not perform hand-ins or hand-outs, a CSM (clock synchronization module) event or alarm may be seen when looking at the event logs for that site during the time frame of the handoff failures. An alarm such as this may be seen:
ALARM:1-10030
Severity = Critical
Description = CSM Lost Phase Lock
Category = CSM/MAWI

If there are failures during the drive test and any pertinent events or alarms are found, correlate the specific alarm with the drive test logs. If the alarm was generated in the same time period, that event may have caused the drive test failure. There are many events and alarms logged. A complete list of all events and alarms can be found at http://www.cig.mot.com/~dimeo/all.alarms in comma-delimited text format. The System Output Messages Reference gives a detailed description of what the events and alarms mean. This document can be downloaded from http://www.cig.mot.com/CIG/IviewDocs/cdrom2/cci/www/colls/sc_pf/226a252.coll. As these event logs can be very large, it is useful to use UNIX commands such as grep to pull specific events from them. If an alarm indicates that a device went out of service (OOS) for a period of time, scroll down through the event logs and verify that the device was repaired and put back into service; otherwise, notify the lead CBSC engineer and lead system engineer immediately. The following hardware hierarchy should be used in the BTS troubleshooting process: BTS->BTSLINK->GLI->CSM->BDC->LCI->BBX->MCC. BTS-related problems should be escalated to the CBSC engineers and Cellular Field Engineers (CFEs). Again, relay as much detail as possible when escalating the problem.
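For example, to check whether the CSM alarm above was raised during the failure window, the hourly event log can be filtered with grep. A minimal sketch follows; the event log file name is illustrative.

    # Pull all CSM events for the suspect hour, then count how many
    # alarms in that hour were critical (***); file name illustrative.
    grep -i 'CSM' event.1999031908
    grep -c '\*\*\*' event.1999031908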


9.6.4.4 Problems That Cannot Be Diagnosed or Resolved

If a problem cannot be diagnosed or resolved, it should be reported back through the proper channels. Table 9.6.4.4-1 shows the areas where problems may be seen by the RF optimization team and whom they should contact. Any problems should be noted in the PRM, and the optimization team leader or lead engineer should be informed with as much detail as possible.

HARDWARE
Problem Area/Item: BTS Card
  Escalate To: A CFE should be contacted to remove and replace the board/part. Fill out a change request form (see Appendix 9A) and note in PRM.
Problem Area/Item: Link
  Escalate To: Notify CBSC engineer and CFE and note in PRM.
Problem Area/Item: CBSC/XC/EMX
  Escalate To: Notify CBSC engineer and note in PRM.

SOFTWARE
Problem Area/Item: Mobile
  Escalate To: Take mobile out of testing pool. If possible, send back to manufacturer or reload software and retest.
Problem Area/Item: BTS
  Escalate To: 1. Look at FYIs* and other available information from CNRC. 2. Call CNRC at 800-433-5202 or 847-632-5390 to speak with a person and obtain more information if necessary. 3. If this problem has not been reported to development and should be, open an MR**.
Problem Area/Item: CBSC/XC/EMX
  Escalate To: Notify CBSC engineer.

*FYIs can be found at http://mcsc.cig.mot.com/Search/; click the check box before "FYI For Your Information". Choose to sort the results by score, date, or doc, and choose how many results per page to display. If desired, enter a start date, end date, and product name, then click the Search button.
**To open an MR, follow the instructions found at http://scwww.cig.mot.com/SC/sw/BuildGroup/www_sablime/sabfaq/sabfaq.html#a5-3.

Table 9.6.4.4-1: Problems Seen and Escalation Procedures

9.6.5 Subscriber Unit Issues

Call statistics are typically used to gauge network performance and partially verify the contractual warranty performance agreement. Sometimes the statistics can be skewed by one or more poorly performing mobiles; identifying and removing any problematic mobiles will make the statistics represent system performance more accurately. The performance of all mobiles used in drive testing should be monitored. A mobile may simply have been testing in particularly poor RF coverage, so do not assume a mobile is bad based on only one drive test. In most cases the drive test team will know if a mobile is performing abnormally poorly. The drive team should be trained to notify a system engineer if a mobile is having a high number of failures.


There are several ways to determine if there is a bad mobile; one of the most effective is to use the CDLs and the CAT. This process should be done even if the statistics appear to be within a normal range: it is important to know that all mobiles are performing within a similar statistical range.

1. Run the CAT on CDLs from the mobiles used in the optimization drive tests. Review the output files esn.t20, esn_cfc.dst, and call_dur.dst to determine if the problem is a subscriber unit issue.

A. The esn.t20 (ESN Top 20) report looks at CFCs 3, 4, 5, 6, 7, 9, 13, 27, 61, 62, and 255 for each ESN. It also produces a Top 20 list for all access failures. For each CFC, ranked statistics are provided in two distribution categories: Highest raw totals and Highest percent. Figure 9.6.5-1 is an excerpt from esn.t20. This example demonstrates how the two rankings, Highest raw totals and Highest percent, are split up.
Unique ESNs = 32
Top 20 list for CFC dist on ESN
ESN        RAW   PERCENT
Highest raw totals for CFC= 3
63e60c83   2     16.67
21300c97   2     20
43b0c81    2     10
b5de0c83   2     10
9c9b0c83   2     18.18
8be20c97   2     7.692
7f1d0c97   1     33.33
7c020c97   1     20
Highest percent for CFC= 3
431d0c83   1     100
2abe0c83   1     100
bbf70c83   1     100
80ca0c97   1     100
359f0c97   1     50
b1520c83   1     50
7f1d0c97   1     33.33

Figure 9.6.5-1: Excerpt from esn.t20 from CAT


B. The esn.t20 file has the above format for CFCs 3, 4, 5, 6, 7, 9, 13, 27, 61, 62, and 255. A poorly performing mobile can be found by reading through this file and paying attention to its CFC distribution. In the following example, Figure 9.6.5-2, note the statistics for ESN 77a00c97: approximately 70% of this mobile's calls result in an access failure or dropped call (the CFC 4, 5, and 9 percentages sum to roughly 70%). The mobile should not be used for call statistic testing until its performance is investigated. The Highest percent sections were left out of this example; because the number of call attempts differs for each ESN, a mobile may appear in Highest raw totals but not in Highest percent. If the mobile was used for continuous call testing, where the call is left on traffic for very long periods of time or until dropped, there may be an unusually high percentage of drops (CFC 4).
Highest raw totals for CFC= 4
77a00c97   11   23.33
3c000c97   10   29.41
269d0c97    9   32.14
77420c97    8   42.11
51390c97    7   11.29
7190c81     7   35
Highest raw totals for CFC= 5
73020c82   53   20.23
77a00c97   14   25.485
ae3e0c81   11   45.83
38cb0c81   10   15.62
403a0c97   10   6.098
44b90c81    9   11.69
Highest raw totals for CFC= 9
69300c83    9   52.94
77a00c97    8   21.86
1e4b0c97    6   25
6fae0c83    5   20
403a0c97    5   14.29
73020c83    4   57.14

Figure 9.6.5-2: Excerpt from esn.t20 for bad mobile

C. The esn_cfc.dst (ESN CFC distribution) report gives a total breakdown for each mobile in the system. The "e" option should be exercised when running the CAT to create this file.

D. The call_dur.dst (call duration distribution) report breaks the call duration for good calls, dropped calls, setup failures, hard handoff calls, and all calls into eight time bins. It is useful to compare good call duration with dropped call duration to see if the drops are consistently happening in one time frame. Use the "p" option when running the CAT to create this file.

For more information on these reports, please see the CDL Analysis Tool User's Guide, available for downloading from http://www.cig.mot.com/~wheelrts.




Call Duration Distribution (s is seconds)
Call Duration Distribution for CBSC/MMADDR = e1
[Abridged: the report tabulates call counts for the All, Good, Drop, Setup, and HHO call categories across duration bins (0 - 15 s, 15 - 30 s, 30 - 45 s, 45 - 60 s, 60 - 90 s, 90 - 120 s, 120+ s), followed by the average duration for each category. Summary statistics from the example:]
RFloss / Minute = 2.077
FWD_QUALITY ave = 0.0083
RVS_QUALITY ave = 0.0330

Figure 9.6.5-3: Excerpt from call_dur.dst from CAT

2. If a bad mobile is identified, notify and escalate the problem to the lead system engineer responsible for drive test logistics. A comparison test should be done using the suspected bad mobile and a mobile that is known to be free of performance problems. If the mobile's call origination or drop statistics are more than 7 to 10 percent worse than the control mobile's, the bad mobile should be pulled from drive testing and investigated. A new software flash or a hard reset can sometimes restore the mobile's performance. After the mobile has been fixed or reset, the system engineer should perform an additional comparison test to verify the mobile is performing within normal statistical ranges. Mobile problems that have been identified in different markets include incorrect message sequences, message sequence numbers that do not increment, and service option negotiation issues. These are only a few examples of mobile failures.
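Before running the comparison test, it can be useful to pull every entry for the suspect mobile out of esn.t20 in one pass; a minimal sketch, using the suspect ESN from the example above:

    # Show each "Highest raw totals/percent for CFC=" header together
    # with any line for the suspect ESN, to see its standing in every
    # CFC category at once.
    grep -E 'CFC=|77a00c97' esn.t20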


9.6.6 Data Collection/Processing Troubleshooting

There are two main types of data collected during a drive test: mobile files (or DM files) and CBSC files (SMAP, CDLs, event logs, etc.). On occasion, the collected data sets may be incomplete or corrupted. This section identifies the most common causes of incomplete or corrupted DM or CBSC data, and provides guidelines to troubleshoot and prevent these data collection problems.

9.6.6.1 Mobile Data Problems

9.6.6.1.1 Identifying Mobile Data Problems

Mobile data problems may be suspected if any or all of the following are seen:

1. Data Processing Errors. Figure 9.6.6.1.1-1 below shows an example of the error window in Compas. For the example shown, no errors were encountered. However, if there is a problem with the data, the processing run usually aborts and indicates the type of error encountered. If a different post-processing tool is used, check its user's guide to see if error logs are generated, where they are stored, and what the logs are named.

2. Corrupt Files. Compas generates a file called <datestamp>.FIL which lists all files processed. If a file is corrupt, it will be listed as an unknown type of file.

3. File Size Differences. The mobile files are different sizes after transferring them from the DM onto the processing computer. For example, a mobile log file is 6,495 KB on the DM but only 4,950 KB after being transferred onto the target computer.

4. Number of Files Differences. The number of mobile files transferred from the DM does not match the number listed on the mobile logs. (A scripted check for items 3 and 4 follows this list.)

5. Missing Data. The data processes, but upon examination the engineer discovers data missing for a particular ESN, or entirely missing ESNs. See Figure 9.6.6.1.1-2 below for an example of missing data.

6. Data Processing Crashes. The post-processing tool crashes or will not process the data.
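Checks 3 and 4 above can be scripted on the processing machine after the transfer; a minimal sketch, with an illustrative directory and file extension taken from the example log sheet:

    # Count the transferred DM files to compare against the log sheet,
    # then list name and size of each for comparison against the DM.
    ls /data/drive_0319/*.qlc | wc -l
    ls -l /data/drive_0319/*.qlc | awk '{print $9, $5}'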



Figure 9.6.6.1.1-1: Example of Error Window in Compas


Figure 9.6.6.1.1-2: Example of Missing Data


9.6.6.1.2 Preventing Mobile Data Problems

Typical problems seen in the field with mobile data are grouped into four classes as follows:

GPS Problems
DM Problems
File Transfer Problems
Field Operator Error Problems

Each of these is discussed below.

GPS Problems
1. The GPS was not connected properly during the test.
2. The GPS had no power or was not turned on.
3. The GPS used was not compatible with the DM software.
4. The GPS was broken.

The GPS should always be checked before any metric route or test to make sure it is powered up, receiving the satellite signal, and working properly; see the checklist in Appendix B-1. Chapter 7, Single Cell Functional Test, has a basic diagram showing how the DM and GPS should be connected. Some GPS manufacturers include a script or program with their GPS for the user to check that the GPS is receiving the satellite signal. This program usually can also set the baud rate and other parameters for the GPS to operate correctly. Check with the GPS manufacturer/reseller (see Chapter 6) for more information.

DM Problems
There are numerous reasons the DM can have problems. As with any computer, there may be conflicts between the different programs loaded, which may cause spontaneous crashes or lock-ups. This should be tested as much as possible in the office prior to field data collection. Since the DM operates in a rugged field environment, it will be subjected to varying temperature, humidity, vibration, and shock conditions over its lifetime. As a result, problems with connections, hard drives, and displays may occur more frequently, especially with more use. Care should be taken to secure the DM, GPS, and power equipment, along with the attached power and signal (serial data and RF) cables, to reduce the impacts of road testing so that DM lock-ups or crashes are minimized. Despite these precautions, the data files should be saved at regularly scheduled intervals to minimize any re-drive activity in the case of a computer failure.


File Transfer Issues
If the files are transferred from the DM onto a networked computer, care should be taken to transfer the files correctly. For example, when FTPing DM files across a network, they should be transferred in binary mode. Before transferring the DM logs, the hard drive space on the processing computer should be checked to make sure there is enough room for all the DM logs. This may be done using the UNIX command df -k, or by looking at the space available on a DOS or Windows machine (see the computer's user guide for specific instructions). If there is not enough space to transfer the files, space must be created. On a UNIX system, this can be done using the tar, compress, and/or gzip commands. On a DOS or Windows based machine, space can be created by compressing or deleting older files or moving them off the machine onto a backup medium such as tape or Zip disks.

Operator Errors
The items below are the most common operator errors that should be avoided during the data collection process.
1. Operator does not save the logs at the recommended intervals.
2. The drive space is not checked on the DM and there is not enough room on the hard drive to save all the mobile logs.
3. The logs are not saved properly.
4. Problems with the DM, such as crashes or lock-ups, are not noted on the mobile log sheets.
5. The operational status of the GPS was not checked before the test began.
6. Problems with other equipment (e.g. electrical problems, broken antennas) were not noted on the mobile log sheets.
7. Log sheets were not kept accurately (e.g. a different number of mobile files on the DM than on the log sheet).

Checklists, such as the one in Appendix B-1 of this document, should be generated and used before each test to develop good data collection habits. Mobile log sheets should be compared with the land operator sheets (if applicable). Log sheets should also be inspected at random intervals by overseeing engineers to ensure that the data collection crews are keeping accurate records.

9.6.6.2 CBSC Data

The data gathered at the CBSC should be transferred over a network onto a post-processing machine. Only the SMAP files contribute to the generation of RF performance plots, specifically the reverse FER. Other data, such as CDLs, event logs, and PM data, will be used for more detailed troubleshooting and to generate statistics that can be used to track improvements in system performance. Because the data collected at the CBSC plays such an important role in system optimization, it is critical that this data be error free.


SMAP operation requires configuration files which identify the data the tool is to collect when it is turned on. SMAP must also be invoked and shut down properly. Take care to follow the SMAP operation procedures in the user's guide so that valuable data is collected. SMAP should also be monitored while it is running, to ensure that its file sizes are increasing. The two primary causes of corrupted CDLs, event logs, and/or PM data are:

not transferring the files correctly (i.e. binary mode for FTP), and
not having enough hard drive space for the files on the target machine.
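Both causes are easy to guard against with a short pre-transfer routine. The sketch below is illustrative only: the host name, login, mount point, and backup directory are placeholders, while /sc/cdl follows the path given in Section 9.6.4.2.

    #!/bin/ksh
    # Check free space on the post-processing machine before pulling
    # the CBSC files; the mount point is illustrative.
    df -k /data

    # Transfer the files in binary mode; host, login, and directory
    # names are placeholders for this sketch.
    ftp -n omcr1 <<EOF
    user scadmin scpass
    binary
    prompt
    cd /sc/cdl
    mget cdl.*
    bye
    EOF

    # If space runs short, archive and compress older data sets first.
    tar cf - /data/old_data | compress > /backup/old_data.tar.Z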

Since these data sets can get very large, it is recommended that one or two persons be designated to perform daily management of this data. This task would include the transfer of the data, checks of the hard drive space where the data will be transferred, and corrective actions to provide disk space as necessary. Those corrective actions could include deletion of older, unneeded data, or archiving data onto other devices.

9.6.7 Integration of Each Cluster into the Overall Network

A checklist like the following one should be used to ensure each cluster is integrated into the overall network by implementing Inter-CBSC Soft Handoff (ICSHO). This should include checks for each level of equipment: CBSC, XC, and EMX.

9.6.7.1 CBSC Level

The implementation will differ depending on whether this is a new system with multiple CBSCs, an existing system adding another CBSC, or an existing multiple-CBSC system. The main items in the CBSC to be checked are:

IC trunk groups are created
Neighbor lists (XC sectors) are correct
Correct or additional equipment is installed (e.g. MSI, KSW, and FEP cards)
IC spans are equipped
IC links are installed and equipped
The OMC groups are established and correct
The CBSC ID is correct

9.6.7.2 XC Level

This feature impacts several MMI commands required to establish the XC database. These commands are:

Target System Number - A source CBSC must have knowledge of a SCAP System field in order to be able to route SCAP messages to the target CBSC.
FEP Address - This field appears as the two left-most digits of the FEP Site Configuration Flag required for the XC EQUIP_DEVICE FEP command.
ICLDL Equipage - A new XC device must be equipped to correspond to each ICLINK equipped.
Inter-CBSC Traffic Channels - The XC command ADD_ICTCH_BY_SPAN is used to add multiple inter-CBSC traffic channels to a span line.


9.6.7.3 EMX Commands

If the anchor handoff method selected is not KEEP_SOFT, then the following EMX commands are required to allow the anchor handoff to take place.

BSS BSSRTE - This database entry on the EMX contains the Destination Point Code for A+ messaging and the BSS trunk group for the terrestrial circuits to the target BSS.
BSS CELRTE - This entry on the EMX tells the target EMX where to find the target BTS.

More information on each of these, along with additional information regarding this feature, can be found in the Cellular Application Note for Inter-CBSC Soft Handoff. This can be downloaded from the following web address: http://www.pamd.cig.mot.com/nds/cts/rftech/App_Notes/icsho/icsho.html.

9.7 Exit Criteria

1. Problem Resolution Matrix is current and reflects all work completed to date.
2. Neighbor list has been optimized.
3. Each individual cluster is integrated into the overall network.
4. Final parameter recommendations have been implemented.
5. RF performance plots show acceptable performance throughout cluster.



Appendix 9A Change Request Forms and Change Orders


PARAMETER CHANGE REQUEST FORM

Date: ________________                       Date Needed (MM/DD/YY): __________
                                             Date Changed: __________
Requestor (Name / Title): ______________________________________________

Describe changes wanted:                     Notes:


Are these changes Permanent or Temporary? ________________________________________
Why? _______________________________________________________________________

MIB parameter before change: ______________________   Date checked: _____________
MIB parameter after change:  ______________________   Date checked: _____________


SYSTEM OPTIMIZATION CHANGE ORDER

Date: __________                Cluster #: __________
Engineer on Drive: __________   Data Files: __________
Drive Team: __________          Equipment: __________
Drive Type: __________          Call Type: __________
Comments:

Recommended Changes (Cust / MOT):
  Change: ______________________   Change Order #: __________
  (one row per recommended change)

Changes Made:
  Change: ______________________
  (one row per change made)

Signatures:
  RF Engineer: ______________________   Date: __________
  RF Manager:  ______________________   Date: __________


Customer Name
CHANGE ORDER

Cluster #: __________    Drive date: __________
Comments:

Changes Requested:
  Parameter changed: ______________   From: ________   To: ________
  (one row per change requested)

Requested by: ______________________   Date: __________
Completed by: ______________________   Date: __________



10.0 Final Coverage Survey & Warranty Verification


10.1 Description
The closing step in network optimization is to conduct a drive test to verify that the network meets contractual warranty clauses. Some customers may want to perform this final drive under load, which can be simulated on the forward link by employing SMAP to generate OCNS. The relationship of this activity to the overall network optimization process is shown in Figure 10.1-1.

[Figure 10.1-1 is a flow diagram of the overall process. Optimization Preparation: Network Design Verification (Chapter 2), with inputs "Accurate Terrain, Clutter, Xlos Tuning Data" and "System Design via NetPlan/CSSS"; Equipment Installation and Test Verification (Chapter 3); RF Parameters Database Verification (Chapter 4); Spectrum Clearing (Chapter 5); Data Collection and Analysis Tools Selection, Install, Test (Chapter 6). Network Optimization: Single Cell Functional Test (Chapter 7); Initial Coverage Test (Chapter 8); System Optimization and Detailed Problem Resolution (Chapter 9); Final Coverage Survey and Warranty Testing (Chapter 10). System Operations (Chapter 11); Commercial Service: Network Performance Monitoring and Expansion (Chapter 11).]

Figure 10.1-1: Relationship of Final Coverage Survey and Warranty Testing Activity to Entire Optimization Process

After several iterations of drive testing, in which all hardware and database problems have been eliminated and the system parameters have been modified to provide optimal network performance, the final drive will document the performance prior to handover to the customer. Up to this point, the PRM should have the history of network optimization changes documented for each cluster in the network.

10.2 Tools Required

The tools listed in Table 10.2-1 can support the final coverage survey and warranty verification activity.

Item: DM, with phone and GPS
  Description: Used to collect data.
Item: SMAP
  Description: Used to collect reverse link messaging and, if needed, simulate loading on the forward link using OCNS. User's guide and configuration notes can be found at http://www.cig.mot.com/~thakkar/smap.html
Item: Post Processing tool
  Description: Used to process the collected data and generate plots.
Item: ASSIST Script
  Description: Run on *.map files created by COMPAS to verify all contract warranty performance criteria have been met. Available from http://www.rochellepark.pamd.cig.mot.com/~dhelm/
Item: Event logs
  Description: Used to verify system stability during the final drive exercise.
Item: CDLs
  Description: Used to generate call completion and drop rates.

Table 10.2-1: Tools Required for Final Coverage Survey/Warranty Verification

10.3 Personnel Required

Personnel          Skills*
System Engineer    Green Belt
Driver             Valid driver's license
DM Operator        Computer literate
CBSC Operator      Knowledge of OCNS and SMAP
CFE (on call)      Knowledge of BTS hardware and troubleshooting capabilities

*See Appendix A for more information.

Table 10.3-1: Personnel Required

10.4 Entrance Criteria

1. RF optimization team understands contract warranty requirements.
2. Problem Resolution Matrix is current and reflects all work completed to date.
3. Neighbor list has been optimized.
4. Final parameter recommendations have been implemented.
5. RF performance plots show acceptable performance throughout cluster, and engineering team is confident that warranty performance will be met.
6. Data collection, processing, and analysis method(s) plus exit criteria have been discussed and agreed upon with the customer.


10.5 Procedure

The final coverage survey and warranty verification consists of the following steps:
data collection, data processing, and generation of reports
special evaluation(s) for contractual warranty certification
final documentation of network configuration and performance

10.5.1 Data Collection and Processing

The data collection and processing procedures will be the same as previously laid out in Chapters 8 and 9. Drive routes should be agreed upon with the customer and may be more detailed than previous metric routes. The only difference at this point may be the addition of Orthogonal Channel Noise Source (OCNS), which allows simulation of a load on the forward link. SMAP BTS sessions can be configured to provide this forward link loading; SMAP installation and configuration notes are available at the following URL: http://www.cig.mot.com/~thakkar/smap.html. If OCNS will be used, be sure that the CBSC operator properly inhibits the MCC channels that will generate OCNS, then start the BTS sessions in SMAP to generate OCNS. The CBSC operator shall monitor the sessions throughout the day to make sure the sessions are still running and there are no problems. Take the necessary precautions to turn off OCNS and return the MCC channels to proper operation after the data collection is completed.

10.5.2 Contractual Warranty Certification

The customer and Motorola will have defined the exit performance criteria in the customer contract prior to any network optimization. A copy of the contractual requirements can be obtained from the account team that is handling that market. There are several ways to determine if the system meets the contract warranty performance criteria. One way is with the ASSIST script, which is run on the *.map files created by COMPAS. This script can be obtained from http://www.rochellepark.pamd.cig.mot.com/~dhelm/ (choose the link titled "RF Warranty Tools Page"). ASSIST processes bins from all sets of drive data to determine if the system performance meets all of the RF warranty criteria. ASSIST creates three reports: a drop call report, an FER report, and an O/T (origination/termination) report. Examples of each follow.

DROP CALL REPORT:

This data is based on the file
/users/dhelm/TOOLS/drive_test/sprint/testbed/sampling/call_state.map

The data has been compared to the following Covered Area Inclusion Mask:
/users/dhelm/TOOLS/drive_test/rf_warranty/testbed/include.map



******************************************************
The following calculations take into account the CAIM
******************************************************
Number of Setups: 20
Drop Call: 5.00%
Channel Element Usage = 1.28
Soft Handoff Usage = 22.5%
Number of RfLosses: 1

FER REPORT:

The FER is calculated from this file:
/users/dhelm/TOOLS/drive_test/sprint/testbed/cont/results.map

The data has been compared to the following Covered Area Inclusion Mask:
/users/dhelm/TOOLS/drive_test/rf_warranty/testbed/include.map

*****************************************************
The following calculations take into account the CAIM
*****************************************************
Number of Bins with both FFER & RFER: 8
Number of Bins with either RFER or FFER over 2%: 3
Percent of Bins with both FFER and RFER under 2%: 62.50%
Number of FFER Bins: 8
Number of FFER Bins over 2%: 2
Percent of FFER bins under 2%: 75.00%
FFER Average per Bin: 0.84%
FFER Average with the worst 10% removed: 0.84%
Number of RFER Bins: 8
Number of RFER Bins over 2%: 1
Percent of RFER bins under 2%: 87.50%
RFER Average per Bin: 1.30%
RFER Average with the worst 10% removed: 1.30%

OT REPORT:


WARNING: only 26.32% of the origination/termination bins match CAIM
See section 5.4 of the Users Guide for troubleshooting notes
You will need to use the following output file: orig_term.bins
This will be stored in the output directory:
/users/dhelm/TOOLS/drive_test/rf_warranty/testbed

This data is based on the file
/users/dhelm/TOOLS/drive_test/sprint/testbed/ot/call_state.map

The data has been compared to the following Covered Area Inclusion Mask:
/users/dhelm/TOOLS/drive_test/rf_warranty/testbed/include.map

***********************************************************
The following calculations take into account the CAIM
***********************************************************
Setups: 5
Setup Failures: 3

Call Completion Rate: 40.00%

10.5.3 Final Documentation

Prepare a Final Cluster Binder. The binder should contain a summary report of all parameters captured from the CBSC, along with all the parameter change requests submitted by the analysts during system optimization. The Problem Resolution Final Report starts with a listing of each drive test and the date on which it was performed. The report contains the different areas identified as having RF performance problems and gives a description of each area, observations, and conclusions. Final recommendations are provided for each area after completing the final drive. It is recommended to keep an extra copy of the final binder for future reference. Appendix 10A contains an example of the Final Cluster Binder.

10.6 Analysis Conducted


Both COMPAS and ASSIST run in Unix. The ASSIST user guide has been attached to this chapter as Appendix A. Review the results given in the ASSIST reports. If the data does not meet all contract warranty performance criteria, analysis must be done to determine the cause; it may be necessary to revisit Chapter 9 for detailed problem resolution. If the warranty criteria are met, review the results with the customer and provide them with the final binder. The account team should also keep a copy of the final binder for future reference.



10.7 Exit Criteria

1. Final coverage survey is completed.
2. The final reports listed in Section 10.5.2 have been generated.
3. Special contract warranty tests have been passed.
4. An exit review has been conducted with the customer.



APPENDIX 10A
This appendix provides an example of the Final Cluster Binder.

EXAMPLE:

1: Problem Resolution Matrix

2: System Performance Report
This section of the report will contain drive statistics for the last drive. These should include the CDL breakdown by CFC and the drive team statistics, which include total call attempts, call completion rate, and drop call rate. If the system is commercial, this section should have System Performance Graphs that include total call attempts, call completion rate, and drop call rate. For a system that is already commercial, it should also include a CFC summary at the CBSC level and a CFC summary at the BTS level.

3: Site Database
This section should contain a printout of the "BTS Information". Make sure that all of the database files, such as the tilts and SIF powers, have been updated. In order to see all your BTS data on this web page, you need the following databases.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Power: do a "DISP BTS-### PPSGAIN" on the CBSC for all of your BTSs, then call the file "APOWER.ZCT" (where 'A' is the specific unit and 'Z' is the customer).
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Tilts: Put a file called "Current_Tilt_List" in the following directory: /netplan/cdma/tilts/ZCT/. The format should be as follows:

*** CDMA ANTENNA TILT LIST ***
Fri Sep 4 16:22:07 JST 1998
BTS#  SEC#  CURRENT TILT  REQUEST TILT  SCHEDULE REQUEST/COMPLETED  BTS_Name  unit
------------------------------------------------------------------------
1     1     -1            -1            o/                          Site 1    B_3
1     2      3             3            /                           Site 1    B_3
1     3      4             4            /                           Site 1    B_3
1     4      4             4            /                           Site 1    B_3
1     5      2             2            6/17                        Site 1    B_3
1     6      5             5            o/o                         Site 1    B_3
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Antenna height: An export from NetPlan Administration to obtain an "Antenna.unl" file.


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
One will also need to capture some of the basic system parameters, using the following script on the OMC-R at the CLI command prompt.

#!/bin/ksh
# Display basic system parameters for every BTS under the given CBSC.
# Usage: <script> <CBSC number>
input=$1
bts=`allstatus cbsc-$input | grep BTS- | cut -c5-8`
for parm in PPSGAIN MAHO SECGEN TCHGEN SECTOP ROUTENUM
do
    echo "$parm"
    for CBSC in $bts
    do
        displayrc bts-$CBSC location | grep $CBSC | grep -v bts | cut -c1-43
        displayrc bts-$CBSC $parm
        for sector in 1 2 3 4 5 6
        do
            displayrc carrier-$CBSC-$sector-1 $parm | grep CARRIER-
        done
        echo
    done
    echo
done
echo
displayrc CBSC-$input XCSECT
echo
echo
echo
echo "Task completed."
echo "good-bye !!!"

This can also be accomplished through use of the Show All Parms Script located at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html

4: Parameter Change Requests
This should include all Parameter Change Request forms and Tilt Request forms.

5: System Plots
This section should include the following:
Forward FER initial drive
Rx initial drive
Tx initial drive
Ec/Io initial drive
Delta initial drive (Optional)
Forward FER final drive
Rx final drive
Tx final drive
Ec/Io final drive



11.0 System Operations


11.1 Description
Prior to commercial service, the customer often designates a period of friendly user trials on the system. Friendly users are issued registered phones and asked to give feedback on their calling experiences (e.g. do they drop calls in a particular area every day?). A checklist, such as the one in Appendix 11.A, can be given to the friendly users to help streamline their problem reporting process. This user information will help determine if there are areas that may need further optimization due to the load on the system. It may identify problem areas that the drive teams did not include on the drive route, or identify other problems not previously found during optimization.

The feedback from the friendly users may indicate that it is necessary to add a cell site to the system. Since such a site is being added to a live system, extra precautions must be taken so that there is no system downtime. A CDMA cell site addition procedure is given in Section 11.5.2.

During the friendly user stage and commercial service, system monitoring is required at the CBSC level for several reasons. These include:

Monitoring system performance as the number of users increases
Continuing to isolate and remove bad hardware from the system
Baselining new product releases and privates/patches as they are installed
Separating the effects of specific subscriber unit models as they are added to the system

Specific tools and procedures to monitor the system will be discussed later in this chapter. The relationship of these activities to the overall network optimization process is shown in Figure 11.1-1.


[Figure 11.1-1 repeats the overall process flow diagram shown in Figure 10.1-1.]

Figure 11.1-1: Relationship of Friendly Users, Commercial Service and Performance Monitoring Activities to Entire Optimization Process



11.2 Tools Required

Item: Vehicle (recommended quantity: 1 per team)
  Description: Preferably a van with enough room for all data gathering equipment plus DM operators. Should be equipped for drive testing, including power source and routing for external antennas if necessary.
Item: Lap Top Computer (2 per drive test vehicle)
  Description: Can be used as a DM; contains a large hard drive (e.g. 2 GB); compatible with DM and GPS hardware and software; phone interface.
Item: DM Software (1 per laptop)
  Description: Capable of collecting IS-95 messages in different modes of operation (e.g. Markov, various rate sets).
Item: GPS (1 per DM)
  Description: Position locating receiver compatible with DM software and laptop computer.
Item: CDMA Phone with extra batteries or power adapter (1 per DM)
  Description: Phone must have a valid ESN and phone number for the current system.
Item: Analog or other non-test phone (1 per vehicle)
  Description: Used for coordinating activities with the test leader and contacting MTSO personnel or CFEs as needed, or for emergency purposes.
Item: SMAP (1 per MM)
  Description: Used for collecting messaging and reverse link FER at the BTS/CBSC.
Item: RF Performance Analysis Post Processing Tool (dependent on number of engineers)
  Description: COMPAS, OPAS, or equivalent that will enable plotting of RF performance characteristics.
Item: CDL Analysis Tools (1)
  Description: Used to verify system stability as part of post-processing and analysis.
Item: PM Reports (1 per CBSC)
  Description: Network performance statistics that are displayed from the CBSC.
Item: Event Logs (1 per CBSC)
  Description: Used to verify system stability during system optimization.
Item: Unix capable terminal on the LAN with the OMC-R (dependent on number of engineers)
  Description: Various (examples: PC with Exceed, Xterm, Sun Sparc, etc.).



11.3 Personnel Required

Type: DM Operator
  Skill Level: Good computer background, capable of operating DM and CDMA phone. (See Appendix A.4)
Type: Driver
  Skill Level: Valid driver's license. Must keep safety and comfort of data collectors in mind at all times. (See Appendix A.6)
Type: CBSC Engineer
  Skill Level: Strong in Unix operations and experienced in Motorola infrastructure equipment. (See Appendix A.8)
Type: Cellular Field Engineer
  Skill Level: See Appendix A.9
Type: Simulation Engineer
  Skill Level: Green Belt through Blue Belt as appropriate (See Appendix A.2 - A.3)
Type: System Engineer
  Skill Level: White Belt through Blue Belt as appropriate (See Appendix A.1 - A.3)

11.4 Entrance Criteria

1. A system is set up to take friendly user feedback and pass information to the engineer responsible for coordinating drive team activities.
2. Friendly users have been briefed on how to give feedback and are issued registered phones.
3. Drive teams are available for real-time troubleshooting and/or maintenance window support.
4. CFEs are available as needed.
5. There are at least three friendly users per set of antennas, to provide a statistically valid sample.

11.5 Procedure

This chapter covers three activities that happen simultaneously, each with its own procedures to follow. Each is discussed below.

11.5.1 Friendly Users Trial Period Procedure

A checklist may be used to ensure all logistics are in place for the friendly user trial period. Some items to include are:

Determine the users (how many and who).
Determine what areas to put friendly users in (e.g. downtown only, an entire county, etc.).
Obtain phones, program them correctly, and distribute them to users.
Ensure questionable areas have adequate loading to see if problems arise during user trials.


11.5.2 Cell Site Addition Procedure

11.5.2.1 Design, Installation and Component ATP

Identify PN assignments for the new cell site. Design Engineer responsible.

Identify initial power outputs. This can be accomplished using either predictive methods like NetPlan/CSSS or by pre-driving the area. Power output changes may need to be made to both the new site and surrounding sites. Design Engineer and System Performance responsible.

Identify topology. Create neighbor lists for the new site(s) and surrounding sites. See Chapter 9 for a discussion on neighbor lists. Design Engineer responsible.

Build the site into the CBSC. Write database scripts. Load the database and verify it loaded properly by displaying all parameters entered (a verification sketch follows this procedure). Put all devices into PRECUT state to differentiate this site from sites that are in commercial service. If devices are put into OOS state, operations personnel may try to bring the site into service prematurely and interfere with commercial service. In addition, PRECUT sites are not included in CEM availability calculations. Operations group responsible.

Physically install the BTS and ancillary equipment. This includes installing and testing spans, antenna systems, and the BTS. Cell techs and Motorola CFEs will typically be responsible for these tasks.

Conduct the Single Cell Functional Test. Verify cell site functionality: verify power outputs, signal strengths, clockwise and counter-clockwise softer handoffs, and check originations/terminations on all sectors and channel elements. Drive teams or CFEs responsible.

11.5.2.2 Site Integration, Field Optimization (Maintenance window activity)

Make sure optimization tools are available: SMAP, DM with GPS navigation, post-processing tools (COMPAS, OPAS, etc.).

Create drive routes for the new site and the surrounding sites that may be impacted by it.

Conduct an initial coverage survey along the drive route (see Chapter 8 of this document).

Conduct a metric drive along the drive route (access completions, dropped calls).

Process and analyze data to determine problem areas. Collect forward link data (Ec/Io, FFER, and mobile receive power). Turn on SMAP to collect reverse link data (RFER, mobile transmit power at the BTS) if desired. SMAP should not be turned on in a heavily loaded system; use it in a lightly loaded maintenance window.

Troubleshoot problem areas. Look for breakage in performance in the coverage area of the new site as well as in the surrounding areas due to pilot pollution. If there are three active pilots or fewer, make sure the pilots are in the neighbor list. If four or more pilots are in the active set at any one time, try to adjust pilot powers or downtilt antennas to control interference. If mobile transmit power is near maximum, examine the terrain for shadowing and reverse link path limitations that cannot be overcome. See Chapter 9 for complete troubleshooting and analysis procedures. Make database changes as necessary.
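As referenced in the database step above, a minimal post-load verification sweep can reuse the displayrc command shown in the Appendix 10A parameter-capture script; the BTS number and parameter list below are illustrative and should be adapted to the site being added.

    #!/bin/ksh
    # Display key parameters for a newly built site so they can be
    # checked against the design; BTS number 320 is illustrative.
    for parm in SECGEN SECTOP TCHGEN PPSGAIN ROUTENUM
    do
        echo "=== $parm ==="
        displayrc bts-320 $parm
    done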


11.5.3 System Monitoring Procedure

Detailed information can be found in the System Performance Monitoring document. The basic procedure for performing system monitoring is as follows:

View the Worst 15 Cells report from pmsum for a list of worst performers.
Check RF losses and access failures for the worst performers.
Check the cem report for any outages, and alarms for hardware failures.
Check the worst performing cells for BBX Reverse Noise Rise alarms.
Check the output of pmtraf for traffic blocking.
Check the output of pmmcc for channel element failures for the worst performing sectors.
If the PMSUM reports are displayed on a web server, check the trending graphs for any degradations in performance (unusual rises in RF losses, access failures, CPU utilization).
If the PMSUM reports are displayed on a web server, check the HTML list of Worst 15 Cells and click on an individual sector to get trending graphs on a per-sector basis.
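Where the daily reports are saved as text files, parts of this sweep can be scripted. Everything in the sketch below (file names and grep patterns) is illustrative only and must be adapted to the local report formats.

    #!/bin/ksh
    # Daily monitoring sweep over pre-generated report files.
    # File names and search patterns are illustrative only.
    DAY=19990319
    grep -i 'rf loss'     pmsum.$DAY | head -15   # worst performers
    grep -i 'noise rise'  event.$DAY              # BBX reverse noise rise alarms
    grep -i 'block'       pmtraf.$DAY             # traffic blocking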

11.6 Analysis Conducted:


Analysis of problem areas will be treated the same way as outlined in Chapter 9 of this document. After the addition of a cell site to a live system, the preliminary analysis should be the same as shown in Chapter 7, Single Cell Functional Test, of this document; the analysis thereafter follows the normal guidelines set out in Chapter 9. Analysis procedures for the system monitoring done at the CBSC level can be found in the Guide to CDMA System Monitoring document at http://www.rochellepark.pamd.cig.mot.com/~blashkar/bestpractices.html (choose the Operations button).

11.7 Exit Criteria:


1. The system is functioning at a high level (e.g. 99% call completion rate, 2% drop rate, no major problem areas, etc.).
2. The PRM has no open issues that can be resolved. (There may be open issues that cannot be resolved; these should be documented, the reports given to the customer, and the information entered into the PRM.)
3. The system is ready for commercial use.



Appendix 11.A Friendly Users Checklist


1. Be aware of the CDMA service boundaries. Digital-to-analog hand-downs are not drops and should not be categorized as a trouble incident.

2. Tell the customer of any known problem areas and what they may experience. Also, suggest that they don't send trouble tickets for known problem areas, since these have already been identified.

3. Ensure the call is a digital call, not an analog call. If there is crosstalk, static, or noise that sounds like the paging channel, the call is on analog. Look for the digital "D" on the phone display prior to the end of the call. If the call was on analog, the "D" may still appear immediately after the call has terminated.

4. Changes in volume are usually associated with the speaker of the Qualcomm phone. It has only one hole in the earpiece, and the alignment of the earpiece and the ear is very critical: a subtle change in this alignment will result in a decrease in volume to the user.

5. Make some mobile-to-mobile calls and calls to analog mobiles. The majority of calls should be land-to-mobile and mobile-to-land calls. On mobile-to-mobile calls, if there is a problem, there may be a question of which leg is faulty.

6. Avoid the use of a speakerphone on the land side and the mobile side. Reported mutes, echoes, and clips can be caused by the speakerphone.

7. Set all the phones to alarm on a drop or a fail to originate; then the users will definitely know if they accidentally hung up versus a drop/fail. (Fails/drops will sound an alarm; hang-ups will be silent.)

8. On access failures, check that the correct number of digits were dialed.

9. Report audio problems accurately. The audio problems associated with digital systems consist of warbling and muting. Static, continuous noise, and crosstalk are analog-related impairments. The audio problems generally associated with digital handsets are echo, noise bursts (screeches), and one-way audio.

10. Create a list of the mobile user names and their mobile numbers to help track user performance.

11. Accurately record the time of the call so it can be traced to a Call Detail Log record. Time tags at the instant the drop or problem occurred are important so the CDLs and alarm/event messages can be reviewed for information that may lead to the source of the problem, especially if it is hardware related.

12. Complete all the data fields on the Trouble Ticket forms.


13. If a problem occurs, especially drops and access failures, please try to reproduce the problem. In addition, if a dropped call occurs, the subscriber should attempt an origination at the spot where they dropped. That provides information about the availability of a strong pilot in the area where the drop occurred. Please report whether an origination was successfully completed after a drop.

14. Be cognizant of the location of your competitors' cell sites and your cell sites that are analog only. The digital phone (this also applies to analog phones) can be overloaded by strong out-of-band signals, which can affect your service.

15. Document the software release of each handset. Old software versions should not be in use because of their associated problems.
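Item 11 above is what makes after-the-fact analysis possible. As a minimal illustration, the Perl sketch below matches a reported drop time against a simplified CDL export. The input format, field layout, mobile number, and times shown are assumptions for this example and do not reflect the actual Call Detail Log record layout, which is richer and would need real parsing.

#!/usr/bin/perl
# Minimal sketch of tracing a trouble-ticket time tag to candidate CDL
# records, as item 11 suggests. Assumes one "epoch_seconds,mobile,event"
# record per input line (hypothetical format).
use strict;
use warnings;
use Time::Local;

# Reported drop at 14:32:10 on 14 Aug 1999 from mobile 5551234 (placeholders).
my $reported = timelocal(10, 32, 14, 14, 7, 99);   # sec, min, hour, mday, mon-1, yr
my $mobile   = '5551234';
my $window   = 120;                                # match within +/- 2 minutes

while (my $line = <>) {
    chomp $line;
    my ($when, $number, $event) = split /,/, $line;
    next unless defined $event && $number eq $mobile;
    if (abs($when - $reported) <= $window) {
        printf "Candidate record %+ds from ticket time: %s\n",
               $when - $reported, $line;
    }
}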





APPENDIX A Roles and Responsibilities


Several functions have been identified to support the processes that lead to system commercialization. Though many roles are mentioned, more than one role may be performed by a single engineer. Each role and a brief description of its responsibilities are as follows:

A.1 White Belt (System Engineer, Entry Level)

A.1.1 Pre-Requisites

- General knowledge of cellular technology and the deployment process
- Technical degree
- Basic computer skills (Unix, MS Office, etc.)
- Programming skills a plus (e.g., Perl, C)

A.1.2 Recommended Training (Motorola training courses)

- NES109: Propagation Fundamentals, LMPS - 3 days
- SYS300: CDMA System Optimization - 5 days
- GNL070: SC Product Family Overview - 2 days
- GNL170: Digital Technology Overview - 1 day
- GNL180: CDMA IS-95 Implementation and Operations - 2 days
- GNL190: CDMA Call Processing - 2 days
- PER350: NetPlan + CDMA 1 - 4 days
- PER310: COMPAS - 1 day
- ENG722: UNIX Fundamentals - 3 days
- Introduction to Perl - self study
- NSS Optimization Process - self study
- Best Practices documentation available on the web

A.1.3 OJT/Experience

Participate in small-scale RF optimization activity with a Level 3 engineer. Activities may include:

- Drive test and analyze data
- File transfers
- Directory structures
- Parameter modifications/database awareness
- BTS integration awareness
- Identify RF problems
- Generate/review neighbor lists
- Interpret performance management reports
- Drop call, FTO, FTT analysis
- Utilize NetPlan/COMPAS
- Generate drive test plots
- Utilize available scripts/tools
- Awareness of contract requirements

0 - 6 months experience.

A.1.4 Competencies

Understands RF propagation characteristics; familiar with digital technologies (TDMA, GSM, CDMA); understands the RF optimization process; familiar with the available RF optimization tools; familiar with the CDMA SC product family and all network elements; understands path loss models, link budgets, antenna characteristics, interference sources, and frequency planning; able to utilize basic Unix commands; able to follow basic script logic in Unix, Perl, or C.

A.2 Green Belt (System Engineer)

A.2.1 Pre-Requisites

- White Belt RF Optimization achievement
- Knowledgeable of the overall RF optimization process
- Knowledge of RF optimization tools and RF analysis techniques
- Able to utilize and troubleshoot tools without mentoring

A.2.2 Recommended Training

The above courses plus one or more of the following:

- SMR120: Motorola's Implementation of IS-95 - 1 day
- GNL210: CDMA Network Optimization - 2 days
- GNL220: CDMA Field Performance Optimization - 1 day
- GNL230: CDMA Network Optimization - 2 days
- SYS300: CDMA System Optimization - 5 days
- SYS610: (MU) Risk Analysis of Network Architecture
- SYS800: (MU) Network Reliability Objectives
- SAO010: SC Product Family, System Administration & Operation - 10 days
- SAO200: CBSC Operations Lab - 2 days
- Advanced Scenario Workshop - internal NDS group
- CBSC Database - internal NDS group
- Advanced UNIX (course number needed)
- Advanced Perl (course number needed)
- CTD131: Time Management - 1 day
- Self-study: development documentation on new features/parameterization/system impacts
- Self-study: CFC resolution document
- Self-study: IS-95 specifications


A.2.3 OJT/Experience

Participate in medium to large scale RF optimization activity. Champion one or more of the following activities under Blue Belt engineer supervision:

- NetPlan simulation (review)
- Contract requirements review
- Identify optimization exit criteria
- Evaluate availability of input criteria
- RF optimization project planning
- BTS integration
- Plan metric drive routes
- Lead drive test team
- Database and neighbor list review/modification
- Monitor system performance and benchmarking, utilizing CDLs, PMSUM, and CAT output
- System analysis of call processing using COMPAS and/or SMAP
- Utilize available optimization scripts/tools
- Identify RF problem areas
- Troubleshoot problems related to feature implementation
- Help train account team personnel on RF optimization tools/processes
- Participate in FOA or development integration/testing of new features or special applications (microcell, etc.)

6 - 12 months experience.

A.2.4 Competencies

Knowledgeable of all White Belt competencies; able to plan RF optimization events; able to follow script logic in Unix, Perl, or C and modify it per application if necessary; understands Motorola's implementation of IS-95; able to deploy/optimize new features; understands the CBSC database and able to review/modify it as necessary; able to troubleshoot RF or feature-related problems; able to monitor the system and evaluate performance; able to identify new tools which may further aid the RF optimization process; able to train the account team on RF optimization methodology. Has good written, verbal, and time management skills.



A.3 Blue Belt (System Engineer)

A.3.1 Pre-Requisites

- Green Belt RF Optimization achievement
- Expert in RF optimization
- Knowledgeable in the Motorola product line and RF optimization tools
- Experienced in troubleshooting
- Willingness to adopt new troubleshooting techniques

A.3.2 Recommended Training

- SAO100
- Advanced Perl
- CTD131: Time Management
- SYS840: (MU) TMN Architecture & Standards
- SYS850: (MU) Network Reliability Measurement & Control
- MGT842: (MU) Project Planning, Analysis and Control

A.3.3 OJT/Experience

- Champion small to large scale RF optimization projects
- Build and manage a team of RF optimization engineers throughout system optimization
- Provide mentoring and training opportunities to Level I and Level II engineers to continuously upgrade their skills
- Lead in training account team personnel on RF optimization tools/processes
- Serve as customer/internal management contact
- Prioritize events to meet contractual requirements and exit criteria
- Anticipate problems and plan accordingly to prevent them
- Know when to escalate problems
- Serve as RF optimization technical focal point
- Analysis and evaluation of new feature integration
- 2 - 3 years experience

A.3.4 Competencies

- All of the above Level II competencies
- Capable of championing any of the RF optimization activities required
- Able to effectively communicate with NDS project management, product management, and development/FOA organizations
- Very good written, verbal, time management, and negotiation skills
- Very good troubleshooting skills
- Able to build a team
- Good conflict management skills
- Able to write senior management/customer reports



A.4 Black Belt (System Engineer)


A.4.1 Pre-Requisites

- Blue Belt RF Optimization achievement
- Management training

A.4.2 Recommended Training

- MGT843: Project Management, Leadership, and Communication
- Time Management
- MGT908: Adaptive Management - 2 days

A.4.3 OJT/Experience

- Build and manage a team of RF optimization engineers throughout system optimization
- Prioritize events to meet contractual requirements and exit criteria
- Anticipate problems and plan accordingly to prevent them
- Know when to escalate problems
- Serve as RF optimization technical focal point
- Analysis and evaluation of new feature integration

A.4.4 Competencies

- All of the above Blue Belt competencies
- Capable of championing any of the RF optimization activities required
- Able to effectively communicate with NDS project management, product management, and development/FOA organizations
- Very good written, verbal, time management, and negotiation skills
- Very good troubleshooting skills
- Able to build a team
- Good conflict management skills
- Able to write senior management/customer reports

A.5 Diagnostic Monitor (DM) Operator

A.5.1 Role of the DM Operator

The objective of the DM operator is to collect the data required to benchmark the system. This is accomplished by driving specific routes with a mobile DM in Markov mode and by making M-L and L-M calls. The DM operator can be a technician or an entry-level engineer, and will be required to learn how to operate the DM.

A.5.2 Responsibilities of the DM Operator

- Operate the DM
- Log test calls (location, quality, etc.)
- Mark maps with dropped calls and poor performance areas
- Ensure call operation



A.6 Landline Operator

A.6.1 Role of the Landline Operator

The main role of the Landline Operator is to participate in any calls. This position may also make copies of completed tests and coordinate the upkeep of the Cluster Books.

A.6.2 Responsibilities of the Landline Operator

- Participate in L-M and M-L calls
- Rate the quality of calls
- Mark maps with locations of drops, poor audio quality, anomalies, etc.

A.7 Driver

A.7.1 Role of the Driver

The role of the driver is to safely drive a predetermined route based on general maps for the area and to assist the DM Operator or RF optimization engineer. The driver could be a summer intern, contractor, technician, or entry-level engineer.

A.7.2 Responsibilities of the Driver

- Review the drive routes prior to the drive to become familiar with the route
- Assist with coordinating logistics
- Assist in providing feedback on locations of dropped calls or poor coverage
- Drive the test routes safely
- Flexibility to adapt to changing schedules

A.8 Bridge Operator

The role of the Bridge Operator is to be the point of contact between Motorola personnel and any temporary workers. For smaller systems with only a few temporary workers, this function is not required.

A.9 CBSC/Switch Engineer

A.9.1 Role of the CBSC/Switch Engineer

The primary objective of this engineer is to ensure CBSC performance.

A.9.2 Responsibilities of the CBSC/Switch Engineer

Pull stats and call detail logs (CDLs) and produce a daily report including:

- Usage minutes
- RF loss/usage minutes
- Origination access attempts and failure percentages
- Termination access attempts and failure percentages
- D/A HHO attempts and failure percentages
- Soft HO attempts and failure percentages
- Source and target soft HO attempts and failure percentages
- Blocking percentage broken down by cell, sector, and system
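The failure percentages in this report are simple ratios of failures to attempts. A minimal Perl sketch, using placeholder counts and illustrative metric names, is shown below; in practice the values come from the CDLs and performance statistics.

#!/usr/bin/perl
# Sketch of the attempt/failure percentage lines of the daily report.
# Metric names and counts are placeholders.
use strict;
use warnings;

my %metrics = (
    'Origination Access' => { att => 4200, fail => 37 },
    'Termination Access' => { att => 3100, fail => 24 },
    'D/A HHO'            => { att =>  260, fail =>  9 },
    'Soft HO'            => { att => 9800, fail => 55 },
);

printf "%-20s %8s %8s %9s\n", 'Metric', 'Attempts', 'Failures', 'Failure%';
for my $name (sort keys %metrics) {
    my ($att, $fail) = @{$metrics{$name}}{qw(att fail)};
    my $pct = $att ? 100 * $fail / $att : 0;      # guard against zero attempts
    printf "%-20s %8d %8d %8.2f%%\n", $name, $att, $fail, $pct;
}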

Additional responsibilities:

- Maintenance of call statistics scripts
- System maintenance parameter display and optimization (changes)
- Evaluate how any modifications made by the DM team affect the system
- Maintain a log book of changes made and why; this will serve as a record of all parameters that have been changed if a new software load is loaded with generic values
- General maintenance of the CBSC, AP, OMC, and SMAP
- Software and hardware upgrades of the above listed platforms
- Assist the various team members in troubleshooting issues to be resolved

A.10 CFE

This team will be needed on a periodic basis for performing the Noise Floor Test, verifying calibration data, ensuring hardware/antenna integrity, and any other BTS verification required by the other teams. Two different levels of field support can be envisioned. The first level of support can be viewed as a BTS Technician whose main responsibility is to reseat boards and perform minor field work. The second level of support is capable of calibrating the cell sites (ATP) and performing more sophisticated testing.

A.11 Database Engineer

The role of the Database Engineer is to generate the initial MIB and CLI files to be loaded into the Mobility Managers and Transcoders. This engineer is responsible for recreating these files, if required, when a new CBSC release is sent to the field. This engineer is to support the optimization team if issues relating to commands and parameters arise.

A.12 Development Support

The role of Development Support is as the name implies. Since CDMA is still a fairly new technology, many things are still being learned throughout all aspects of deploying a CDMA system. Therefore, development support will be required for all areas from time to time. Such areas may include, but are not limited to: simulation questions, software issues, hardware issues, BTS, Mobility Manager, transcoder, OMC-R, and data collection tools.



Appendix B Hardware/Software
This appendix lists the additional hardware and software equipment required for data collection:

- PCMCIA card for PC-to-mobile connection
- ELPAC power supply or the drive van's 12 V terminals (a mobile cable with a 5-pin DIN connector is needed for the ELPAC or power inverter)
- Externally mountable antenna (800 MHz or 1.9 GHz) with mini-UHF male connector
- Assorted cables

If GPS is desired, the following are also required:

- One serial port for the GPS connection
- A PS/2 mouse or internal pointing device must be used
- PCMCIA serial I/O card and adapter cable for systems with PCMCIA slots
- GPS 9-pin null modem cable (supplied with GPS)



Appendix B-1: Checklist for Metric Operators


Pre-Departure (Inside Facility):

- Sign in
- Test phone
- Extra batteries & charger
- DM
- DM cables
- Clipboards
- Log sheets & extras
- Pencils (min. 2)
- Route sheets & maps
- Analog phone & phone list
- Gas card/money (driver)
- GPS unit & antenna
- Sign out on board showing all equipment, etc.

Equipment Check:

- Make test call
- Log onto DM: check mode, check GPS, check hard drive space
- Check gas level in van (driver)
- Check air pressure in tires (driver)

Returning to Facility - Equipment:

- Verify directory structure & set up if needed
- Remove DMs, cables & phones from van
- Remove log sheets and equipment from vans
- Put phones on chargers & plug in
- Compose tally & summary sheets
- Create file folder for data sheets
- Route information to appropriate people
- Store equipment
- File information
- Clean up any extraneous paperwork
- Transfer data

Returning to Facility - Maintenance:

- Clean out all trash from vans
- Fill vans with gas



Glossary
ACH - Access Channel
A/D - Analog to Digital
AP - Application Processor
Bandwidth - A relative range of frequencies that can carry a signal without distortion on a transmission medium
BBX - Broadband Transceiver or Baseband Transceiver
BDC - Broadband Distribution and Combiner Card
BHCA - Busy Hour Call Attempts
Base Station - A radio transceiver located near the center of each cell in a cellular telephone network, which communicates with all of the active cellular telephones in the cell and provides them with a connection to the switched telephone network
BSC - Base Site Controller
BSS - Base Station System
BTS - Base Transceiver (Sub)System
CAMPS - CDMA Advanced Mobile Phone Simulator
CBSC - Centralized Base Station Controller
CCP - CDMA Channel Processor
CDF - Configuration Data File or Cell-Site Data File
CDL - Call Detail Log
CDMA - Code Division Multiple Access
CE - Channel Element
CEP - Channel Element Processor
CFC - Call Final Class
CFE - Cellular Field Engineer
CLI - Command Line Interface
CLMF - CDMA LMF
CM - Configuration Management
CP - Call Processing
CP - Call Processing Processor
CSM - Clock Synchronization Module or Manager
CSSS - CDMA Static System Simulator
D/A - Digital to Analog
DAHO - Database Assisted Handoff
dB - Decibel
Delay Spread - The time lag due to multipath
DM - Diagnostic Monitor
DRAM - Dynamic Random Access Memory
DSU - Digital Service Unit
Eb/No - Energy per Bit/Noise
ELPA - Expandable LPA
EMI - Electromagnetic Interference
EMX - Electronic Mobile Exchange
ERP - Effective Radiated Power
ESN - Electronic Serial Number

FDMA - Frequency Division Multiple Access
FEP - Front End Processor
FER - Frame Error Rate or Frame Erasure Rate
FM - Fault Management
FRU - Field Replaceable Unit
FTP - File Transfer Protocol
GCLK - Generic Clock
GLI - Group Line Interface
GPROC - Generic Processor
GPS - Global Positioning System
GSM - Global System for Mobile Communications
GUI - Graphical User Interface
HATA - Propagation loss model based on empirical data gathered from cities in Japan
HHO - Hard Handoff
HLR - Home Location Register
IM - Intermodulation
INS - In Service
Io - Inter-cell Interference
ISDN - Integrated Services Digital Network
ISI - Inter System Interference
IS-41 - US Mobile Architecture
IS-95 - US CDMA Standard
IS-136 - US TDMA Standard
It - Total Cell Interference
JTC - US Joint Technical Committee
KSW - Kiloport Switch card
LAN - Local Area Network
LAPD - Link Access Protocol, D channel
LFR - Low Frequency Receiver or Reference
LMF - Local Maintenance Facility
LPA - Linear Power Amplifier
MAHO - Mobile Assisted Handoff
MCAP - Motorola Cellular Advanced Processor
MCC - Multi-Channel CDMA or Carrier Card
MCCCE - MCC Channel Element
MGLI - Master Group Line Interface
MHz - Megahertz (10^6 Hz)
MIB - Management Information Database
MIN - Mobile Identification Number
MM - Mobility Manager
MMI - Man Machine Interface
MS - Mobile Station
MSA - Mobile Service Area
MSC - Mobile Switching Center
MSI - Multiple Serial or Spanline Interface Card
MSN - Mobile Station Number
MTBF - Mean Time Between Failures
MTSO - Mobile Telephone Switching Office
NID - Network ID
O&M - Operations and Maintenance
OCNS - Other Channel Noise Source
OMC - Operations and Maintenance Center
OMC-R - Operations and Maintenance Center - Radio
OMC-S - Operations and Maintenance Center - Switch
OOS - Out Of Service
OOS_AUTO - Out of Service, Automatic
OOS_MAN - Out of Service, Manual
OUNS - Other User Noise Source
PA - Power Amplifier
PAD - Power Attenuator Device
PCH - Paging Channel
PDC - Japanese Personal Digital Cellular
PHS - Personal Handyphone System
PM - Performance Management
PN Code - Pseudonoise Code
PN Sequence - Pseudonoise Sequence
PRM - Problem Resolution Matrix
PSTN - Public Switched Telephone or Telecommunications Network
QCELP - Qualcomm Code Excited Linear Predictive
RF - Radio Frequency
RFDS - Radio Frequency Diagnostic Subsystem
RFMF - RF Modem Frame
RGLI - RFDS Group Line Interface
RSSI - Received Signal Strength Indication
RX - Receive or Receiver
SALT - System-Wide Audio Loopback Test
SCAP - SC Application Protocol
SCH - Sync Channel
Sector - An RF coverage area segment
SHO - Soft Handoff
SID - System ID
SIF - Site Interface Frame
SMAP - System Monitoring Application Processor
SP - Service Processor
SQL - Structured Query Language
SS7 - Signaling System #7
STRAU - SC Transcoder Rate Adaptation Unit
TACS - Total Access Communications Systems
TCH - Traffic Channel
TDMA - Time Division Multiple Access
TERCKT - Terrestrial Circuit
TX - Transmit or Transmitter
VAF - Voice Activity Factor
VLR - Visitor Location Register
Vocoder - Voice encoder/decoder
WLL - Wireless Local Loop
XASECT - External Analog Sector
XC - Transcoder Subsystem, or the frame or shelf
XCLINK - Transcoder Link
XCDR - Transcoder card
XCSECT - External CDMA Sector
XCVR - Transceiver card

