RN2022-13N
RANOP – Radio Network Optimization Principles
Assessment
RN20222EN13GLN0 1
Objectives
Contents of RANOP

• Introduction to optimization
  – What is network optimization?
  – What should be taken into account when starting network optimization?
• Assessment
  – Situation at the moment
• KPIs and measurements
  – Measurement tables + KPIs
• Solution finding (optimization) and verification
  – Maximum gain in limited time
  – Bottlenecks
• Features to be considered
  – NSN recommended features to be used in optimization
Contents of Assessment

• Configuration Assessment
  – Introduction / dependency table example
  – Area assessment
  – Network assessment
• Parameter Assessment
  – Parameter values
  – Default and specific values
  – Parameter checks (default and specific parameters)
  – Consistency checks & delta values
  – Frequencies (planning tool vs. real network values)
  – Parameter discrepancies
  – Feature assessment
• Performance Assessment
  – Benchmarking KPIs used
  – Performance data analysis
  – Multi-vendor KPIs
  – Field tests
  – Alarms
• Tools to be used
Assessment – Introduction

[Process diagram: data collection feeds the configuration, parameter and performance assessments; monitoring/analysis leads to solutions, prioritization, work orders, implementation and verification. The loop repeats while the criteria are not fulfilled; once they are, the project report is written and the project ends.]
© Nokia Siemens Networks
Configuration Assessment
Area assessment:
• Situation in general
• Area size
• Antenna types …
Network assessment:
• BTS HW assessment
  • Which HW versions are in use at a given site
• BSC
  • BSC name and type
  • BSC utilization and maximum capacity
Parameter Assessment
Parameter values
Default parameter check:
• Export the actual default parameters from the OSS, e.g. as an XML file
• Always use the latest default values (based on the latest studies)
• Import the XML file into a database, e.g. using CM Plan Editor
• Compare the planned default sets with the actual default parameters in the network
• Mark exceptions for each default parameter
• Show the differences
• Select the changes to be made
• Create a .dat or XML file for the corrections
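The comparison step above can be sketched as a simple diff of planned defaults against per-cell values exported from the OSS. All parameter names and values below are invented for illustration; they are not actual OSS identifiers.

```python
# Sketch: compare planned default parameters against actual per-cell
# values exported from the OSS. Names and values are illustrative only.

planned_defaults = {"rxLevAccessMin": -110, "hoMarginPbgt": 4, "msTxPwrMax": 33}

# Actual values per cell, as they might look after parsing the exported file.
actual = {
    "CELL_001": {"rxLevAccessMin": -110, "hoMarginPbgt": 6, "msTxPwrMax": 33},
    "CELL_002": {"rxLevAccessMin": -105, "hoMarginPbgt": 4, "msTxPwrMax": 33},
}

def find_exceptions(planned, network):
    """Return {cell: {param: (actual, default)}} for every deviation."""
    diffs = {}
    for cell, params in network.items():
        dev = {p: (v, planned[p]) for p, v in params.items()
               if p in planned and v != planned[p]}
        if dev:
            diffs[cell] = dev
    return diffs

exceptions = find_exceptions(planned_defaults, actual)
for cell, dev in sorted(exceptions.items()):
    for param, (act, dft) in sorted(dev.items()):
        print(f"{cell}: {param} = {act} (default {dft})")
```

The marked exceptions would then be reviewed, and the selected changes written out as the correction file.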
[Figure: the network divided into several specific parameter areas, each with its own specific parameter set in addition to the defaults.]
[Diagram: consistency check data flow. Actual default and specific parameter values are exported from the CM database via parameter files; planned values come from NetAct Planner through the ODBC planning database and a MapInfo export. Consistency checks compare the planned parameter file against the actual default and specific values in the network.]
[Figure: frequency check, planning tool vs. real network values. Most cells match (100/99, 96/92, 78/72, 75/115, 81/110), but one cell is planned as 112/88 while the real network uses 115/88; channel 115 is then shared with a nearby cell (75/115) => bad interference. One TRX should be added.]
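The co-channel check behind an example like this can be sketched as follows; the cell names, channel numbers and adjacencies are illustrative only.

```python
# Sketch: flag co-channel conflicts between adjacent cells in a
# frequency plan. Cell names and channel numbers are illustrative.

frequencies = {
    "A": {100, 99},
    "B": {75, 115},
    "C": {115, 88},   # reuses channel 115, which cell B also carries
}
neighbours = [("A", "B"), ("B", "C")]

def co_channel_conflicts(freqs, adjacencies):
    """Return [(cell1, cell2, shared_channels)] for adjacent cells
    that reuse the same channel (co-channel interference risk)."""
    conflicts = []
    for a, b in adjacencies:
        shared = freqs[a] & freqs[b]
        if shared:
            conflicts.append((a, b, shared))
    return conflicts

print(co_channel_conflicts(frequencies, neighbours))
```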
[Table: adjacency (ADCE) definitions per TRX; each neighbour is identified by BSC id, BTS id, N-LAC, N-Cell id and N-BCCH.]
Performance Assessment
Performance Assessment – Introduction

• Challenges
• Missing data
  – If some hourly data is missing, it is difficult to notice without hourly analysis
  – Not all measurement tables are activated
  – Not all tables can be found in the OSS
• Benchmarking KPIs
  – Recommended for reporting
  – Release specific
• Customer KPIs
  – Almost all operators have operator-specific KPIs
  – Important KPIs, because optimization is done for the customer
• Limitations
  – SW release limitations: results may contain errors if the wrong KPIs are used
  – Feature limitations: results may contain errors if the wrong KPIs are used, for example if HR is not in use
Statistics

Strong points +
• Allows centralized data collection
• A cost-efficient way to monitor network quality
• Pro-active
• Selective performance monitoring (e.g. after a new feature activation)
• Permanent information flow
• Useful for monitoring trends

Weak points -
• Needs a statistically relevant traffic volume to provide reliable results
• Only limited geographical localization of problems is possible
• The meaning of the counters can be difficult to understand
Benchmarking KPIs – CS

OSS counter and CS KPI analysis gives an exact picture of network performance. The analysis of CS KPIs can be based on the following list:
• Accessibility
  – SDCCH access KPIs
  – TCH, SDCCH blocking KPIs
  – HO failure due to blocking
• Retainability
  – TCH, SDCCH drop KPIs
  – Total HO failures, HO drops
• Quality
  – UL/DL quality KPIs
• Traffic share
  – TCH, SDCCH traffic sum
  – AMR traffic share

[Slide note: S13 EDGE, GPRS and GSM KPIs can be seen here.]
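As an illustration of how such KPIs are derived from raw counters, here is a minimal sketch; the counter names and values below are invented for the example and are not actual NetAct counter IDs.

```python
# Sketch: deriving CS benchmarking KPIs from raw counters.
# Counter names and values are illustrative, not real NetAct counters.

counters = {
    "tch_seizure_attempts": 12000,
    "tch_blocked_attempts": 180,
    "tch_drops": 95,
    "tch_successful_seizures": 11820,
}

def ratio(num, den):
    """Percentage ratio, guarding against a zero denominator."""
    return 100.0 * num / den if den else 0.0

# TCH blocking: blocked attempts over all seizure attempts.
tch_blocking = ratio(counters["tch_blocked_attempts"],
                     counters["tch_seizure_attempts"])

# TCH drop rate: dropped connections over successful seizures.
tch_drop_rate = ratio(counters["tch_drops"],
                      counters["tch_successful_seizures"])

print(f"TCH blocking:  {tch_blocking:.2f} %")
print(f"TCH drop rate: {tch_drop_rate:.2f} %")
```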
Benchmarking KPIs – PS

OSS counter and PS KPI analysis gives an exact picture of network performance. The analysis of PS KPIs can be based on the following list:
• Network usage
  – UL/DL PS traffic
  – UL/DL payloads
• PDTCH congestion
  – DL hard/soft blocking
• Abis/PCU congestion
  – Inadequate EDAP resources in DL
  – DL MCS selection limited by PCU
• PDTCH quality
  – UL/DL (E)GPRS RLC throughput
  – TBF success ratio
• Mobility
  – Downlink flushes per minute
• Availability
  – Data service availability ratio
• User experience
  – LLC throughput
• QoS
  – TBF establishment/success ratio
• Connectivity capacity
  – DL MCS selection KPIs
• Mobility
  – DL flush
• E2E data rate
  – LLC throughput
  – Volume-weighted LLC throughput
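The difference between a plain average and a volume-weighted LLC throughput can be shown with a small sketch; the payload and transfer-time figures are invented for the example.

```python
# Sketch: plain vs. volume-weighted LLC throughput over measurement
# samples. Figures are illustrative only.

# (payload in kbit, transfer time in s) per sample
samples = [(500.0, 10.0), (50.0, 5.0), (2000.0, 25.0)]

# Plain average: every sample counts equally, so a tiny transfer
# influences the result as much as a large one.
per_sample = [vol / t for vol, t in samples]          # kbit/s each
plain_avg = sum(per_sample) / len(per_sample)

# Volume-weighted: total payload over total transfer time, so large
# transfers dominate, reflecting what most of the traffic experienced.
weighted = sum(v for v, _ in samples) / sum(t for _, t in samples)

print(f"plain average:   {plain_avg:.1f} kbit/s")
print(f"volume weighted: {weighted:.1f} kbit/s")
```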
Customer Complaints

Strong points +
• Requires good co-operation between network optimization and the customer service department
• Helps to find unknown problems
• Helps to confirm known problems
• Might pinpoint the location of a problem
• Reports indoor coverage holes
• May trigger fast decisions from management when heavy users are involved
• Reveals problems with specific mobiles
• Reveals problems with mobile configuration

Weak points -
• Reacting to customer complaints is reacting too late
• Not efficient for optimizing a whole network
• Difficult to distinguish between MSS and BSS problems

E.g.: errors in the numbering plan can only be detected from customer complaints!
Multi-vendor KPIs

Strong points +
• Performance can be compared by using statistics

Weak points -
• Counter trigger points might be different
• Calculation methods might be different
  – Are we counting explicit failures, or (all attempts - successful attempts)?
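A small sketch of why the two calculation methods can disagree: one vendor may peg an explicit failure counter (e.g. only failures with a recorded cause), while another derives failures as attempts minus successes, which also captures silently lost attempts. All counter values are invented.

```python
# Sketch: two ways of computing a failure ratio that look equivalent
# but are not. Counter values are illustrative only.

attempts = 1000
successes = 960
explicit_fail_counter = 25   # e.g. only failures with a cause code pegged

# Method 1: explicit failure counter over attempts.
fail_ratio_explicit = 100.0 * explicit_fail_counter / attempts

# Method 2: derived failures (attempts - successes) over attempts.
fail_ratio_derived = 100.0 * (attempts - successes) / attempts

# The gap between the two is exactly the attempts that neither
# succeeded nor pegged the explicit failure counter.
print(fail_ratio_explicit, fail_ratio_derived)
```

Comparing such KPIs across vendors without checking the underlying formula would make one network look better than the other for no real reason.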
Field Tests

[Figure: field test setup, outdoor antenna measuring Network 1.]
Strong points +
• Helps to adjust propagation models
• Troubleshooting
• Detects problems that can be masked by statistics
• Graphical reports on maps that make it easy to identify "hot areas"
• Monitoring and validation of new functionalities or changes made in the network
• Identifies coverage problems, meaning both coverage holes and cells covering less or more than wanted; used in antenna tilt optimisation
• Finding interference (difficult to measure) and bad quality
• Detecting adjacency problems:
  – Missing neighbours
  – Unnecessary or unwanted neighbours
  – Incorrectly defined adjacencies
• Finding hardware problems:
  – Crossed sectors
  – Mixed antenna lines
  – Faulty units (TRXs, BBUs, etc.)
  – Imbalance problems (e.g. due to ROE in cables or jumpers)
Weak points -
• Very resource- and time-consuming, which makes it expensive to monitor wide areas and sometimes restricts drive tests to specific areas
• QoS monitoring from a moving subscriber's point of view gives a feel for the customer experience, although slow-moving, indoor and high-floor MSs are not taken into consideration
• Supplies in most cases only downlink information
• A snapshot in time
• Statistics can be influenced by the driving speed in good and bad areas
• CS
  – Coverage
  – Quality / interference
  – HO measurements
  – CSSR with short call duration
  – Traffic handling measurements
• PS
  – PSW accessibility analysis
    • GPRS attach / PDP context activation / TBF establishment
  – Throughput analysis – stationary
    • Average throughput (RLC/MAC and application)
    • MCS distribution
    • BLER, C/I ratio
  – Throughput analysis – mobility (intra/inter PCU and RAU cell re-selection)
    • LLC, RLC/MAC
    • Retransmission based on cell re-selection
Optimizer
Alarms

If there are many active alarms, the collected data is not valid:
• Performance data is not reliable
• Optimization targets are based on wrong inputs
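A minimal sketch of screening out alarmed cells before the analysis, so that optimization targets are not derived from invalid data; the cell names, KPI fields and values are illustrative only.

```python
# Sketch: exclude cells with active alarms before KPI analysis.
# Cell names, KPI fields and values are illustrative only.

kpi_rows = [
    {"cell": "CELL_001", "tch_drop_rate": 0.8},
    {"cell": "CELL_002", "tch_drop_rate": 7.9},   # alarmed -> data suspect
    {"cell": "CELL_003", "tch_drop_rate": 1.1},
]

# Cells with active alarms, e.g. taken from the alarm monitoring system.
active_alarms = {"CELL_002"}

# Keep only cells whose data can be trusted for setting targets.
clean = [row for row in kpi_rows if row["cell"] not in active_alarms]
print([row["cell"] for row in clean])
```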
Tools to be used