ITU Workshop on Performance,
QoS and QoE for Multimedia Services
Perspectives on QoS Evaluation and
Benchmarking
Tahitii OBIOHA
Radio Network Performance Engineer
SG12RG-AFR
Kigali, Rwanda
4-5 March 2019
QoS Questions
Frequently Asked Questions of a Telecom Regulator faced with QoS challenges.
A. Why should a Regulator even evaluate the QoS of mobile operators?
B. What are the methodologies recommended by ITU/ETSI for QoS evaluation and which
KPIs (High-Level) should be monitored?
C. What methodology takes the Quality of Experience of users into account?
D. What comes next after QoS Evaluation?
E. What formula should a regulator with an NMS use for benchmarking MNOs, given that the
vendors used by MNOs have different counter names and vendor-specific formulas?
F. What is the minimum recommended frequency of QoS Audit and Benchmarking
Reports for Mobile Network Operators?
G. What are the examples of QoS Monitoring/testing tools for Telecom Regulators?
© Planet Network International – ITU Workshop on Performance, QoS and QoE for Multimedia Services
QoS Answers
These QoS challenges have been addressed, and solutions are given in recommendations
such as ITU-T E.800 Supplement 9, ITU-T E.811 and ETSI EG 202 057-3.
Gentle Reminder
Nothing is Possible Without The Network
Relationship between NP, QoS and QoE
• Access Network + Core Network → Highway
• Terminal Equipment → Vehicle/Truck
• QoE depends largely on QoS, which in turn depends on Network Performance (NP); thus NP parameters ultimately determine the QoS.
QoS Perspective for Regulators
• Congestion → Highway Traffic
• Given the rapid growth of mobile services, Regulators are advised to monitor QoS also from a network performance point of view, in terms of DELIVERY and resource availability, rather than at the service level alone.
A. Objectives of QoS Evaluation (for Regulators)
➢ Ensure consumer satisfaction by making known the quality of service that the service provider is required to provide, and that the user has a right to expect, enabling consumers to make informed choices among several service providers.
➢ Assess operators' performance levels by benchmarking their performance against the standards and criteria imposed by the country's Regulatory Authority.
➢ Level the playing field so that mobile operators compete on their own merits and not on alliances and sheer size.
➢ Generally protect the interests of consumers of telecommunication services by putting a check on service degradations and outages through periodic QoS reports published on a corporate website.
B. Methodologies/QoS Parameters as advised by ETSI EG 202 057-3
Different and Complementary Approaches to Mobile QoS
QoS Evaluation of any PLMN, irrespective of the RAT (2G, 3G, 4G or 5G), should be based
on these QoS Categories (NA, SA, SR and/or SI) using High-Level KPIs.
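The category-based evaluation described above can be sketched in a few lines of code. This is an illustrative sketch only: the KPI names and the mapping below are hypothetical examples, not taken from the standard.

```python
# Illustrative sketch: group high-level KPIs by the four QoS evaluation
# categories (NA, SA, SR, SI) and check that an audit plan covers all of
# them. KPI names and the mapping are hypothetical examples.

CATEGORIES = ("NA", "SA", "SR", "SI")

# Hypothetical audit plan: KPI name -> QoS evaluation category
audit_plan = {
    "Cell_Availability_Rate": "NA",
    "Call_Setup_Success_Rate": "SA",
    "Call_Drop_Rate": "SR",
    "Mean_DL_Throughput": "SI",
}

def covered_categories(plan: dict) -> set:
    """Return the set of QoS evaluation categories a plan covers."""
    return set(plan.values())

missing = set(CATEGORIES) - covered_categories(audit_plan)
print("Missing categories:", missing or "none")
```

A regulator's audit plan would pass such a check only when every category has at least one KPI assigned, which is the coverage rule the deck recommends later for QoS audits.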
B. Continuous/Proactive Monitoring (Best approach)
Different and Complementary Approaches to Mobile QoS
• Stationary/Walk/Drive Test
• OMC-R counter measurement using a Network Management System (NMS)

QoS Assessment Target | Best Suitable QoS Approach(es) | Player Concerned
Network coverage      | DT                             | OPERATOR/REGULATOR
Acceptance Procedure  | DT or NMS                      | OPERATOR
Proactive Monitoring  | NMS                            | OPERATOR/REGULATOR
Optimisation Cycle    | DT + NMS                       | OPERATOR
C. Which Approach (NMS or DT) takes account of QoE?
Answer: NMS, because real traffic is used for the evaluation.
C. Why not Crowdsourcing and CDR Analysis for QoE
Analysis of Crowdsourcing:
➢ Crowdsourcing only covers smartphones, leaving out feature phones and hence the majority of subscribers, especially in Africa, where the smartphone penetration rate's all-time high is below 50%.
➢ KPIs such as Congestion and Cell_Unavailability cannot be measured using crowdsourcing.
➢ Crowdsourcing analysis involves a lot of data cleansing, which is yet to be standardised.
Analysis of CDR:
➢ CDR analysis has benefits, but not for QoS/QoE. They include:
➢ Disaster response, health, socio-economics, transportation, SIM-box fraud detection, telephone use patterns, customer profiling and sales forecasting.
C. Why not Crowdsourcing and CDR Analysis for QoE
Analysis of CDR:
➢ KPIs such as Congestion and Cell_Unavailability cannot be measured using CDR analysis.
➢ QoS DELIVERED cannot be evaluated using this methodology; there are also privacy concerns, data discontinuity and accuracy issues, and the file structure (metadata) and contents are tailored such that CDRs are best suited to billing verification and traffic management.
Information (metadata) in CDRs:
➢ the phone number of the subscriber originating the call (calling party, A-party)
➢ the phone number receiving the call (called party, B-party), etc.
Information (metadata) in 2G PM files:
➢ the serving cell
➢ the TCH attempts, the cell unavailability period, etc.
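The contrast drawn above, that PM files carry counters CDRs lack, can be made concrete with a small sketch. The record layout and counter names below are hypothetical; only the KPI definitions (unavailability as downtime over the period, congestion as blocked over attempted TCH seizures) follow the slide's terminology.

```python
# Minimal sketch: two high-level KPIs that PM-file counters support but
# CDRs cannot provide. The PM record layout and field names below are
# hypothetical examples.

def cell_unavailability_rate(unavail_s: float, period_s: float) -> float:
    """Fraction of the measurement period during which the cell was down."""
    return unavail_s / period_s

def tch_congestion_rate(tch_attempts: int, tch_blocked: int) -> float:
    """Share of TCH attempts blocked for lack of radio resources."""
    return tch_blocked / tch_attempts if tch_attempts else 0.0

# One hypothetical hourly PM record for a serving cell
pm = {"cell": "KGL001A", "period_s": 3600,
      "unavail_s": 180, "tch_attempts": 1200, "tch_blocked": 36}

print(f"Unavailability: {cell_unavailability_rate(pm['unavail_s'], pm['period_s']):.2%}")
print(f"Congestion:     {tch_congestion_rate(pm['tch_attempts'], pm['tch_blocked']):.2%}")
```

With CDRs, neither KPI is computable: a blocked call attempt or a cell outage simply never generates a billing record.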
Best Approach (besides both being complementary)
• We recommend both, but if you can have only one tool, it should be an NMS; this is buttressed by the most recent recommendation on QoS of major events, ITU-T E.811 (03/2017).
[Slide graphic: High-Level KPIs (+/-) mapped to the QoS Evaluation Categories (NA, SA, SR and SI): QoS Voice SA, QoS Voice SR, QoS Data NA, QoS Data SA, QoS Data SR, QoS Data SI]
77% of the KPIs monitored during any major event in any country should be realised using an NMS.
For High-Level KPIs, refer to 3GPP TS 32.410 (2G & 3G) and 3GPP TS 32.450 (4G).
High-Level KPIs (+/-) for QoS Audit using NMS, per RAT and QoS Evaluation Category
[Slide table: high-level KPIs, each marked + or -, listed per RAT: 2G (Voice service only), 3G (Voice and Data services only), 4G (Data service only)]
For QoS Audit and compliance, TRAs are advised to choose at least one KPI (+ or -) per QoS Evaluation Category.
+ The higher the value of the KPI, the better
- The lower the value of the KPI, the better
D. Next Course of Action after QoS Evaluation
Is COMPLIANCE (ITU-T E.800 Supplement 9)
Regulators should, as the name indicates, adopt a regulation-oriented approach in which fines are paid to the regulator per cell (i.e. per faulty network element affecting an underserved area with unhappy end users).
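The per-cell fining mechanism described above can be sketched as a simple compliance check. The KPI threshold and the fine amount below are purely illustrative assumptions; actual values would come from the country's regulatory framework.

```python
# Hedged sketch of a per-cell compliance check: each cell whose KPI
# breaches the regulator's threshold attracts a fine. The threshold and
# fine amount are illustrative, not from any actual regulation.

THRESHOLDS = {"Call_Drop_Rate": 0.02}  # e.g. max 2% drop rate per cell
FINE_PER_CELL = 1000.0                 # hypothetical currency units

def total_fine(cells: list) -> float:
    """Sum fines over all cells breaching any KPI threshold."""
    fine = 0.0
    for cell in cells:
        if any(cell[kpi] > limit for kpi, limit in THRESHOLDS.items()):
            fine += FINE_PER_CELL
    return fine

cells = [
    {"id": "A1", "Call_Drop_Rate": 0.010},  # compliant
    {"id": "B2", "Call_Drop_Rate": 0.035},  # breach: fined
]
print(total_fine(cells))  # prints 1000.0
```

Because the check runs per cell, the fine scales with the number of faulty network elements, which is the incentive structure the slide describes.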
E. KPI Formula: standardization (CO-OP initiative)

DT tool (QoS Benchmark):
Call Drop Rate = Number of TCH drops after assignment / Total number of TCH assignments

CO-OP (NMS, QoS Audit & Benchmark):
CallDropRate = (nbrOfLostRadioLinksTCH + unsuccInternalHDOsIntraCell + unsuccHDOsWithReconnection + unsuccHDOsWithLossOfConnection) / (succTCHSeizures + succInternalHDOsIntraCell + succIncomingInternalInterCellHDOs)

ALCATEL (NMS, QoS Audit):
CallDropRate = (MC14c + MC739 + MC736 + MC621 + MC921c) / (MC718 + MC717a + MC717b + MC712)
where:
MC14c - Nbr of TCH (in HR or FR usage) drops in TCH established phase due to BSS problem
MC739 - Nbr of TCH (in HR or FR usage) drops in TCH established phase due to TRX failure
MC736 - Nbr of TCH (in HR or FR usage) drops in TCH established phase due to radio link failure
MC621 - Nbr of TCH drops during the execution of any TCH outgoing handover (inter-cell, intra-cell)
MC921c - Number of pre-empted calls in the cell
MC718 - Nbr of TCH (in HR or FR usage) normal assignment successes
MC717a - Nbr of incoming directed retry (towards a TCH channel in HR or FR usage) successes
MC717b - Nbr of incoming internal and external TCH (in HR or FR usage) handover successes, per TRX
MC712 - Nbr of outgoing TCH handover successes, per TRX (intra-cell, internal inter-cell and external handovers)

The non-standardization of KPIs across equipment vendors makes it difficult for operators with
multiple vendors to easily calculate network-wide KPIs.
The solution is the CO-OP initiative formula for High-Level KPIs (3GPP TR 32.814).
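The CO-OP Call Drop Rate formula shown above can be implemented directly from its vendor-neutral counters. The counter names follow the slide; the counter values below are illustrative.

```python
# Sketch of the CO-OP Call Drop Rate formula, using the vendor-neutral
# counter names given on the slide. Counter values are illustrative.

def co_op_call_drop_rate(c: dict) -> float:
    """CO-OP CallDropRate: lost/failed-handover calls divided by
    successful TCH seizures and handovers."""
    dropped = (c["nbrOfLostRadioLinksTCH"]
               + c["unsuccInternalHDOsIntraCell"]
               + c["unsuccHDOsWithReconnection"]
               + c["unsuccHDOsWithLossOfConnection"])
    carried = (c["succTCHSeizures"]
               + c["succInternalHDOsIntraCell"]
               + c["succIncomingInternalInterCellHDOs"])
    return dropped / carried

counters = {
    "nbrOfLostRadioLinksTCH": 40,
    "unsuccInternalHDOsIntraCell": 5,
    "unsuccHDOsWithReconnection": 3,
    "unsuccHDOsWithLossOfConnection": 2,
    "succTCHSeizures": 4000,
    "succInternalHDOsIntraCell": 500,
    "succIncomingInternalInterCellHDOs": 500,
}
print(f"{co_op_call_drop_rate(counters):.2%}")  # 50 / 5000 -> 1.00%
```

Because the formula is vendor-neutral, the same function applies to every MNO once each vendor's raw counters (e.g. the Alcatel MC-series above) have been mapped onto the CO-OP counter names, which is exactly what makes network-wide benchmarking feasible.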
F. Frequency of QoS Audit and Benchmarking Reports
➢ QoS Audit reports should be monthly for countries whose users experience relatively poor QoS, and quarterly for others.
➢ QoS Benchmarking reports should be quarterly for such countries with poor QoS delivered, and six-monthly for others.
G. Tools for QoS Evaluation/Benchmarking
Example of an NMS tool for Regulators, with a built-in compliance mechanism, default High-Level CO-OP KPIs and more:
– RPM system by PNI
Example of a DT tool for Regulators, with customer-experience application-based monitoring capability and more:
– Nemo Wireless Network Solutions by MidWex
QoS Evaluation Overview
QoS
├─ NP (Objective)
│   ├─ Active / Intrusive Testing: Walk/Drive-around test
│   └─ Passive / Non-intrusive Monitoring: OMC-R counters using NMS
└─ Non-NP (Subjective): Consumer surveys, complaints, etc.
Conclusion/Recommendations
▪ Today, QoS Evaluation and Benchmarking at network level using a DT tool alone is incomplete, and DT results at network level are not representative, owing to sample size and timing of acquisition.
▪ Regulators need to add an NMS to their QoS portfolio in order to obtain the most accurate and complete view of the value offered by MNOs to end users.
▪ The trend, and the widely adopted methodology, is the use of an NMS to process Performance Management (PM) files for monthly QoS Audits, leveraging the CO-OP KPI formulas for quarterly QoS Benchmark reports.
▪ Regulators should put into practice these contribution-driven ITU-T recommendations to achieve their desired country QoS objectives.
THANK YOU FOR YOUR ATTENTION
For more information or guidance on
QoS Monitoring Solutions & QoS Strategies
email: info@planetworkint.com
SG12RG-AFR