Quality of Service Development Group (QSDG) Meeting (2016)
Parameters/Thresholds for Quality of Voice
and Data Services (Case of Ghana)
Isaac Annan Laryea-Monitoring and Compliance, NCA-Ghana
isaac.laryea@nca.org.gh
Amsterdam 2016
Outline
 QoS Legal Framework
 Role of NCA/Monitoring & Compliance
 What We Measure
 Verification of Compliance Statistics
 Operations
 Tools
 Main Indicators Measured
 Targets for Indicators
 Impact (Operators & Consumers)
 Way Forward for Us
Legal Framework
Electronic Communications Act 2008, Act 775
Section 6, subsection 2:
The Authority shall specify
(a) quality of service indicators for classes of public
telecommunications service, and
(b) the means to enforce a licensee's compliance with
its stated quality of service standards, including
measures by which a licensee shall compensate users
adversely affected by a failure to provide electronic
communications service in accordance with the standards.
Role of NCA/M & C in Ghana
 QoS assurance through periodic monitoring and evaluation
 Adopts internationally interoperable standards
 Establishes equipment type-approval regulations and carries out
and certifies type approval of equipment
 Sets/adopts standards and ensures compliance with them
 Establishes and maintains a mutually conducive environment for
operators, the public and authorities that promotes and
safeguards consumer interests
 Monitors the performance of the operators and
directs improvements where necessary
 Publishes industry performance in the media on a quarterly
basis for consumer consumption
What We Measure
In defining parameters, the following factors, among others, are generally
taken into consideration:
 The practicability for operators to make the required measurements
 The practicability for regulators or any independent entity to audit
the results
 The measurements made should reflect the customer
experience and its influence on satisfaction
In effect, both subjective and objective measurements are applied:
- Subjective methods: surveying users
- Objective methods: making tests, sampling calls, counting complaints
Verification of Compliance Statistics
Operators periodically submit statistics on network
performance and customer support, and these are
independently verified.
They are also verified during monitoring; the statistics include
network coverage, outages, consumer complaint reports and
network performance statistics. Only customer-centric
statistics are monitored in the drive tests.
Operators are also required to report any major network
faults, and a database is constantly updated.
Operations
Voice services are evaluated end-to-end, using the "voice call" as
the basic test unit. Data are collected in all parts of the country, with
more concentration on major towns, highways and key installations.
Data measurements are made from a stationary vehicle at
selected hotspots with a mobile equipment system.
Tests are performed automatically, with no human
intervention, regardless of the technology used.
The set-up ensures that both the slave and the master assess the
field; the MO (mobile-originated) and MT (mobile-terminated) calls
are assessed at the master.
Network coverage is assessed by measuring the downlink signal
levels: RxLev (Received Signal Level) for GSM, CPICH RSCP
(Common Pilot Channel Received Signal Code Power) for WCDMA,
the 1x/EV-DO received signal level for CDMA, and RSRP for LTE.
(GPRS/EDGE/WCDMA/CDMA 1x/EV-DO/LTE)
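The per-technology coverage check above can be sketched as follows. The metric names come from the slide; the threshold value, function name and output format are illustrative assumptions, not NCA-mandated figures.

```python
# Sketch of a coverage classification step in a drive-test pipeline.
# Metric per radio technology, as listed on the slide:
SIGNAL_METRIC = {
    "GSM": "RxLev",         # Received Signal Level
    "WCDMA": "CPICH RSCP",  # Common Pilot Channel Received Signal Code Power
    "CDMA 1x/EV-DO": "Rx Power",
    "LTE": "RSRP",          # Reference Signal Received Power
}

def classify_coverage(technology: str, level_dbm: float,
                      threshold_dbm: float = -95.0) -> str:
    """Label a drive-test sample as covered or not against a threshold.

    threshold_dbm is an illustrative default; real campaigns use
    per-technology thresholds.
    """
    metric = SIGNAL_METRIC[technology]
    status = "covered" if level_dbm >= threshold_dbm else "not covered"
    return f"{metric} {level_dbm:.0f} dBm -> {status}"

print(classify_coverage("LTE", -90.0))   # RSRP -90 dBm -> covered
print(classify_coverage("GSM", -100.0))  # RxLev -100 dBm -> not covered
```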
Tools
The tool is a multichannel benchmarking platform. It:
- Provides a direct comparison of multiple networks during a single test drive
- Captures quality and radio parameters from actual subscriber devices and
utilizes standardized algorithms (PESQ and POLQA)
- Evaluates the network end-to-end, utilizing the devices and services used
by customers, to provide QoE
- Utilizes multiple devices (phones, modems, scanners) all running in parallel
- Uses complex scripting capability to emulate customer activity, which
helps in post-processing analysis
Main Indicators Measured
i. Network Coverage – signal strength of coverage of the radio networks;
ii. Voice Call Set-up Time – period of time that the network takes to establish
the communication, after the correct sending of the request (target telephone
number);
iii. Voice Call Set-up Congestion – probability of failure of accessing a signalling
channel during setup;
iv. Voice Call Completion Rate – probability that a call has, after being
successfully set up, to be maintained during a period of time, ending normally,
i.e. according to the user's will;
v. Voice Call Congestion Rate – probability of failure of accessing a traffic
channel during call setup;
vi. Voice Call Drop Rate – probability of a call terminating without any of the
users' will;
vii. Voice Call Audio Quality – perceptibility of the conversation during a call.
Main Indicators Measured - Data
i. Data Access Time – a measure of the time lapse in activating a
PDP context for data service (from the moment the PDP Context Request
message is sent to the moment the PDP Context Accept message is received):

Data Access Time [s] = Time(PDP Context Accept) – Time(PDP Context Request)

ii. Data Access Success Rate – the probability of success in connecting to
the public server:

Data Access Success Rate [%] = (Number of successful PDP context activations /
Total number of PDP context activation requests) x 100%

iii. Data Drop Rate – the probability of a drop in the connection to the public
server without the end user's intervention:

Data Drop Rate [%] = (Number of aborted PDP context activations /
Total number of PDP context activation requests) x 100%

iv. Throughput – the rate of data transfer, in kbps.
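The three PDP-based formulas above can be computed directly from session counters and timestamps. A minimal sketch, with made-up counter values for illustration:

```python
def data_access_success_rate(successful: int, requested: int) -> float:
    """Successful PDP context activations over total requests, in %."""
    return 100.0 * successful / requested

def data_drop_rate(aborted: int, requested: int) -> float:
    """Aborted PDP context activations over total requests, in %."""
    return 100.0 * aborted / requested

def data_access_time(t_request_s: float, t_accept_s: float) -> float:
    """Seconds between PDP Context Request sent and Accept received."""
    return t_accept_s - t_request_s

# Illustrative counters from a hypothetical measurement session:
print(data_access_success_rate(97, 100))  # 97.0 (%)
print(data_drop_rate(1, 100))             # 1.0 (%)
print(data_access_time(10.20, 12.35))     # about 2.15 s
```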
Targets for Indicators - Voice
Voice Call Set-up Time
Call set-up time is the period of time elapsing from the sending of a
complete destination address (target telephone number) to the setting up
of a call.

Call Set-up Time [s] = t(calling signal) – t(address sending)

t(address sending) – moment when the user presses the send button
t(calling signal) – moment one hears the call signal on the caller terminal

Compliance requirement: Call Set-up Time better than
ten (10) seconds 95% of the time.
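The "better than 10 s in 95% of cases" target can be checked over a batch of measured set-up times. A minimal sketch; the sample values and function name are illustrative:

```python
def setup_time_compliant(times_s, limit_s=10.0, quantile=0.95):
    """True if at least `quantile` of the samples are below `limit_s`."""
    within = sum(1 for t in times_s if t < limit_s)
    return within / len(times_s) >= quantile

# 9 of 10 samples under 10 s -> 90%, which misses the 95% target:
samples = [4.2, 5.1, 3.8, 6.0, 9.5, 4.9, 5.5, 7.2, 4.4, 11.3]
print(setup_time_compliant(samples))  # False
```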
Targets for Indicators cont.
Voice Call Set-up Congestion
The probability of failure of accessing a signalling channel during setup;

Set-up Congestion [%] = (Number of calls blocked / Total number of call attempts) x 100

Compliance requirement: Stand-alone Dedicated Control Channel (SDCCH)
congestion less than or equal to 1%.

Voice Call Congestion
The probability of failure of accessing a traffic channel during call setup;

Call Congestion [%] = (Number of calls failed / Total number of call attempts) x 100%

Compliance requirement: Traffic Channel Congestion less than or equal to 1%.
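Both congestion ratios above are simple blocked/failed-over-attempts percentages with a 1% ceiling. A minimal sketch with illustrative counter values:

```python
def setup_congestion_pct(blocked: int, attempts: int) -> float:
    """Calls blocked on the signalling (SDCCH) channel, in %."""
    return 100.0 * blocked / attempts

def call_congestion_pct(failed: int, attempts: int) -> float:
    """Calls failing to obtain a traffic channel, in %."""
    return 100.0 * failed / attempts

# Both targets are <= 1%:
print(setup_congestion_pct(8, 1000))   # 0.8 -> compliant
print(call_congestion_pct(15, 1000))   # 1.5 -> non-compliant
```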
Targets for Indicators cont.
Voice Call Completion Rate
The probability that a call has, after being successfully set up, to be maintained
during a period of time, ending normally, i.e. according to the user's will;

Call Completion [%] = (Number of normally ended calls / Total number of call attempts) x 100%

Compliance requirement: Call Completion greater than or equal to 70%.

Voice Call Drop Rate
The probability of a call terminating without any of the users' will;

Drop Rate [%] = (Number of calls terminated unwillingly / Total number of call attempts) x 100%

Compliance requirement: Call Drop Rate less than or equal to 3%.
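The completion and drop formulas above follow the same ratio pattern; a minimal sketch with illustrative counters:

```python
def call_completion_pct(normal_ends: int, attempts: int) -> float:
    """Normally ended calls over all attempts, in %. Target: >= 70%."""
    return 100.0 * normal_ends / attempts

def call_drop_pct(dropped: int, attempts: int) -> float:
    """Unwillingly terminated calls over all attempts, in %. Target: <= 3%."""
    return 100.0 * dropped / attempts

print(call_completion_pct(850, 1000))  # 85.0 -> meets the >= 70% target
print(call_drop_pct(20, 1000))         # 2.0  -> meets the <= 3% target
```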
Targets for Indicators - Data
Data Access Time
(From the moment the PDP Context Request message is sent to the moment
the PDP Context Accept message is received.)
Compliance requirement: Data Access Time should be better than five (5)
seconds 100% of the time.

Data Access Success Rate
Data Access Success Rate is the probability of success in connecting to the
public server.
Compliance requirement: Data Access Success Rate greater than or equal to 95%.

Data Drop Rate
Data Drop Rate is the probability of a drop in the connection to the public
server without the end user's intervention.
Compliance requirement: Data Drop Rate should be equal to or less than one
per cent (1%).

Throughput
Minimum Downlink Data Speed Rate of greater than 256 kbps in 90% of connections.
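The throughput target ("> 256 kbps in 90% of connections") can be verified over per-connection throughput samples. A minimal sketch; the sample rates are illustrative:

```python
def throughput_compliant(kbps_samples, floor_kbps=256.0, quantile=0.90):
    """True if at least `quantile` of connections exceed `floor_kbps`."""
    above = sum(1 for r in kbps_samples if r > floor_kbps)
    return above / len(kbps_samples) >= quantile

# 9 of 10 connections exceed 256 kbps -> exactly 90%, so compliant:
rates = [512, 840, 300, 1200, 260, 190, 700, 450, 980, 610]
print(throughput_compliant(rates))  # True
```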
Impact
These actions ensure that communications providers
comply with their regulatory obligations and that our
market objectives and purpose are achieved.
Coverage has improved as a result of directives to improve
coverage in certain areas.
Operators have been helped to identify network defects (CST) on
their networks and have tasked their vendors to find solutions.
QoS performance has vastly improved, giving
consumers good value for money.
Way Forward
 Live reporting of QoS performance via a web publisher
 Standalone QoS regulations
 Technical studies into quality-of-experience
parameters - Voice/Data
 Monitoring the QoS and QoE of data services in
4G networks (such as LTE ...)
THANK YOU !!!