Analysis of Service Quality in 3G Mobile Networks

Heidi Lagerström
Supervisor: Professor Heikki Hämmäinen
Instructors: M.Sc. Sami Vesala & M.Sc. Katja Koivu
© Omnitele Ltd. 2005
Heidi.lagerstrom@omnitele.fi
Contents
1. Introduction to the study
• Background, research problem, research methods
2. Quality of Service (QoS) in UMTS Networks
3. Measuring service quality
• Defining Key Performance Indicators (KPI)
4. Case study
Background
• UMTS introduces new real time services to mobile networks,
such as video telephony.
• These real time services require QoS guarantees to function
properly.
• For operators to maintain satisfactory service quality, constant network monitoring is needed.
• Network measurements are based on correctly defined KPIs for
each service.
Operators’ possibilities to utilise QoS in practice have not been widely researched.
Key Performance Indicators have not been defined for the new services from the end-user perspective.
Research problem
How should service quality be measured in 3G networks, and how can the QoS mechanisms be used to affect the service quality perceived by subscribers?
Objectives:
1. What are the KPIs that measure service quality, from end user
perspective, in 3G networks for the key services (AMR voice,
video telephony, video streaming, web browsing and e-mail)?
2. What are the QoS mechanisms in Release 99 and how can
they be used to improve service quality?
Research methods
• Literature study
– 3GPP, ETSI, ITU specifications
– Several books and publications
• Interviews
– Network equipment vendors: Ericsson, Nokia
– Operators: Elisa
– Several other radio network experts
• Case study
– Field measurements for two
operators in live networks
Why do we need QoS?
• UMTS networks support services with very different performance requirements
  – Real-time services require performance guarantees
  – Customer acceptance is closely tied to service quality
• Optimal usage of network resources
  – Radio resources are scarce
  – Cost-effectiveness
  – Return on investment
• Service and user differentiation
  – Meet the different needs of customers (e.g. business vs. consumer)
• Performance requirements
  – Support different services (real-time vs. best effort)
• Competitive advantage!

Sensitivity of the key applications:

Application  | Bandwidth | Delay | Jitter | Loss
Video call   | High      | High  | High   | Med
Streaming    | High      | Med   | Med    | Med
Web browsing | Med       | Med   | Low    | High
E-mail       | Low       | Low   | Low    | High
QoS Traffic Classes
Traffic class  | Characteristics                                                                                                       | Example application       | Demands
Conversational | Preserve time relation between information entities of the stream; conversational pattern (stringent and low delay) | Speech, video calls        | Demanding: delay, jitter
Streaming      | Preserve time relation between information entities of the stream                                                    | Real-time streaming video  | Demanding: bit rate, jitter
Interactive    | Request-response pattern; preserve payload content                                                                   | Web browsing               | Tolerant: delay and bit rate can vary; integrity
Background     | Destination is not expecting the data within a certain time; preserve payload content                                | E-mail, file downloading   | Easiest: delay and bit rate can vary; integrity
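
As a reading aid, the traffic classes above can be captured in a small data structure. The Python sketch below is purely illustrative (the names are invented for this example, not taken from any 3GPP or vendor API); it maps the example applications from the table to their traffic classes.

    from enum import Enum

    class TrafficClass(Enum):
        """The four UMTS R99 traffic classes listed in the table above."""
        CONVERSATIONAL = "conversational"
        STREAMING = "streaming"
        INTERACTIVE = "interactive"
        BACKGROUND = "background"

    # Example applications from the table, mapped to the traffic class that
    # matches their delay/integrity needs.
    EXAMPLE_APPLICATIONS = {
        "speech": TrafficClass.CONVERSATIONAL,
        "video call": TrafficClass.CONVERSATIONAL,
        "real-time streaming video": TrafficClass.STREAMING,
        "web browsing": TrafficClass.INTERACTIVE,
        "e-mail": TrafficClass.BACKGROUND,
        "file downloading": TrafficClass.BACKGROUND,
    }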
QoS Profile Attributes
R99 QoS attribute           | Example value
Traffic class               | Conversational, streaming, interactive, background
Residual BER                | 10^-5
SDU error ratio             | 10^-4
Delivery of erroneous SDUs  | No
Maximum SDU size (octets)   | 1500
Delivery order              | No
Transfer delay              | 100 ms (conversational), 280 ms (streaming)
ARP                         | 1, 2 or 3
THP                         | 1, 2 or 3 (same as ARP)
Maximum allowed bit rate    | e.g. 64, 128 or 384 kbps
Maximum guaranteed bit rate | e.g. 64, 128 or 384 kbps

The traffic class, ARP and THP depend on the operator’s QoS strategy; the bit rates depend on the QoS strategy and the UE/RNC capabilities.
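
To make the attribute list concrete, the sketch below gathers the same R99 attributes into a Python dataclass and fills it with the example values for a conversational bearer. It is only an illustration; the field names are hypothetical and do not follow any vendor or 3GPP API.

    from dataclasses import dataclass

    @dataclass
    class R99QoSProfile:
        """Illustrative container for the R99 QoS attributes listed above."""
        traffic_class: str             # conversational | streaming | interactive | background
        residual_ber: float            # e.g. 1e-5
        sdu_error_ratio: float         # e.g. 1e-4
        deliver_erroneous_sdus: bool   # e.g. False
        max_sdu_size_octets: int       # e.g. 1500
        delivery_order: bool           # e.g. False
        transfer_delay_ms: int         # e.g. 100 (conversational), 280 (streaming)
        arp: int                       # allocation/retention priority, 1..3
        thp: int                       # traffic handling priority, 1..3
        max_bit_rate_kbps: int         # e.g. 64, 128 or 384
        guaranteed_bit_rate_kbps: int  # e.g. 64, 128 or 384

    # Example: a conversational bearer suitable for video telephony.
    video_call_profile = R99QoSProfile(
        traffic_class="conversational", residual_ber=1e-5, sdu_error_ratio=1e-4,
        deliver_erroneous_sdus=False, max_sdu_size_octets=1500, delivery_order=False,
        transfer_delay_ms=100, arp=1, thp=1,
        max_bit_rate_kbps=64, guaranteed_bit_rate_kbps=64,
    )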
QoS Differentiation
• Video telephony → Conversational RAB
• Streaming → Streaming RAB
• Push-to-talk → Interactive RAB, THP/ARP = 1
• Web browsing → Interactive RAB, THP/ARP = 3
• MMS → Background RAB
• Each service gets the treatment it requires according to the QoS profile
• Network resources are shared according to the service needs
• Network resources can be used more efficiently
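
In practice the mapping above boils down to a lookup table in the operator’s configuration. A minimal sketch follows (illustrative only; the mapping is an operator choice, not something standardised):

    # Service-to-bearer mapping from the figure above.
    SERVICE_TO_RAB = {
        "video telephony": {"rab": "conversational"},
        "streaming":       {"rab": "streaming"},
        "push-to-talk":    {"rab": "interactive", "thp": 1, "arp": 1},
        "web browsing":    {"rab": "interactive", "thp": 3, "arp": 3},
        "mms":             {"rab": "background"},
    }

    def bearer_for(service: str) -> dict:
        """Return the bearer configuration a session of `service` should request."""
        return SERVICE_TO_RAB[service.lower()]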
QoS Mechanisms
[Figure: a PDP context with the requested QoS capabilities runs end-to-end from the TE/UE over Node B and RNC (UTRAN), across the Iu interface to the 3G-SGSN and 3G-GGSN (PS domain, Gn interfaces, inter-PLMN backbone and firewall), and over the Gi interface to the external IP network (Internet).]
• Different QoS techniques are used in different parts of the network
  – UTRAN: radio resource management (RRM) and different channel types
  – Transport-level IP: DiffServ (ATM QoS for CS)
  – Gi interface towards the external IP network: DiffServ
• Appropriate QoS must be provided in every network so that the user can experience good service quality
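
On the transport-level IP network the slide mentions DiffServ. A common, though operator-specific, approach is to mark packets with a DiffServ code point chosen per traffic class; the values below are one plausible mapping and are not taken from the presentation.

    # One plausible traffic-class -> DSCP mapping for transport-level DiffServ
    # (an assumption for illustration, not from the slides).
    DSCP_BY_TRAFFIC_CLASS = {
        "conversational": 46,  # EF: low delay, low jitter
        "streaming":      34,  # AF41
        "interactive":    18,  # AF21
        "background":      0,  # best effort
    }

    def dscp_for(traffic_class: str) -> int:
        """Return the DSCP to mark on packets of the given traffic class."""
        return DSCP_BY_TRAFFIC_CLASS.get(traffic_class, 0)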
Operators’ QoS Strategy
[Figure: the five bearers (Conversational, Streaming, Interactive THP/ARP = 1, Interactive THP/ARP = 3 and Background RAB) carry traffic from Node B and RNC through the 3G-SGSN and 3G-GGSN to the application server; user profiles are stored in the HLR.]
• Operators can practise user differentiation by giving each user a set of QoS profiles which he/she is entitled to use
• Operators can practise service differentiation by mapping each service to the bearer that meets its requirements
• User profiles are stored in the HLR. Each user can have several user profiles, which correspond to different services and are mapped to different bearers according to the operator’s strategy.
• Benefits:
  – Meet the needs of different customers
  – Offer each service the quality it requires
  – Optimise network resource usage
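
User differentiation can be pictured as a per-subscriber list of allowed QoS profiles (kept in the HLR) that the network consults when a bearer is requested. The sketch below is a simplified illustration; the profile names and the fallback rule are assumptions, not the operators’ actual policy.

    # Illustrative subscriber -> allowed QoS profiles table (HLR-style).
    SUBSCRIBED_PROFILES = {
        "business-subscriber": ["conversational", "streaming", "interactive-thp1", "background"],
        "consumer-subscriber": ["interactive-thp3", "background"],
    }

    def negotiate_bearer(subscriber: str, requested: str) -> str:
        """Grant the requested bearer if the subscription allows it, otherwise
        fall back to the least demanding profile the subscriber is entitled to."""
        allowed = SUBSCRIBED_PROFILES.get(subscriber, ["background"])
        return requested if requested in allowed else allowed[-1]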
Measuring network performance
[Figure: measurement points along the end-to-end path from the UE through Node B and RNC (UTRAN) and the 3G-SGSN and 3G-GGSN (core network) to the application server in the external network.]
• Network statistics from different counters and interfaces
• Performance statistics from the application server
• Customer feedback
• End-to-end service quality, QoE
• Network performance monitoring → optimisation
Defining the right KPIs
• Different services have different quality requirements
  – KPIs must be defined separately for each of the key services
• KPI categories
  – Service accessibility
  – Service integrity
  – Service retainability
• With inadequate performance indicators and monitoring
  – Problems in network performance and user-perceived quality of service remain hidden
  – Poorly defined indicators may show better quality than exists in reality
    • Incorrect formulas and counters
    • Unreasonable measurement periods (too much averaging etc.; see the sketch below)
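
As a simple illustration of what such KPI formulas look like, and of why the measurement period matters, consider the sketch below (the counter names and numbers are made up for the example):

    def call_setup_success_rate(successful_setups: int, setup_attempts: int) -> float:
        """Service accessibility KPI: share of call attempts that are set up (%)."""
        return 100.0 * successful_setups / setup_attempts

    def call_drop_rate(dropped_calls: int, successful_setups: int) -> float:
        """Service retainability KPI: share of established calls that drop (%)."""
        return 100.0 * dropped_calls / successful_setups

    # Too much averaging hides problems: one bad busy hour disappears
    # into an apparently healthy daily figure.
    hourly_cssr = [99.0] * 23 + [70.0]                   # one hour at 70 %
    daily_average = sum(hourly_cssr) / len(hourly_cssr)  # ~97.8 %, looks fine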
Example: Voice Services - CS
Customer demand       | Indicator                                                            | Measure
Service accessibility | Availability & coverage; call setup success rate; call setup delay  | Ec/N0, RSCP; admission control; RAB assignment
Service integrity     | Voice quality                                                        | Noisy frames (FER), MOS
Service retainability | Call drop rate                                                       | Handover failure; no coverage; interference
Example: Data Services - PS
Customer demand       | Indicators                                                                        | Measures
Service accessibility | Availability & coverage; access success rate; service access delay               | Ec/N0, RSCP; admission control; attach, PDP context activation, IP service setup
Service integrity     | Video quality; audio quality; web page download time; e-mail sending time, etc.  | BLER, FER, throughput, delay, jitter
Service retainability | Dropped data connection; connection timeouts                                     | Dropped PDP context/attach; no coverage etc.; handover failure
Measurement plan
Measurement                                                        | Operator 1 | Operator 2
Drive test: AMR speech                                             | X          | X
Drive test: FTP download                                           | X          | X
Video telephony                                                    | X          | N/A
Streaming                                                          | X          | X
Web page download                                                  | X          | X
E-mail                                                             | X          | X
Data connection: attach, PDP context activation, RTT, FTP DL & UL  | X          | X
Tools: Nemo Outdoor, Optimi x-AppMonitor, Ethereal
AMR voice – Drive test statistics
                                  | Operator 1 | Operator 2
Call setup time (s)               | 4.649      | 2.494
Call setup success rate (%)       | 70         | 100
Call completion rate (%)          | 100        | 100
Soft handovers per call           | 10.33      | 10.38
Soft handover interval (s), ave.  | 7.901      | 7.953
Soft handover success rate (%)    | 100        | 100
Best active Ec/N0 (dB), ave.      | -4.03      | -3.97
Best active RSCP (dBm), ave.      | -79.3      | -66.9
Tx power (dBm), ave.              | -15.5      | -29.5
BLER                              | 0.254      | 0.221
Pilot BER                         | 1.917      | 2.363

Reference values: call setup success rate should be ~100 %; Ec/N0 is good above -10 dB; RSCP is good above -92 dBm; Tx power is good below 21 dBm.

Reasons for call failure:
• Ec/N0 was not at an adequate level
• Call setup was unsuccessful (unsuccessful RACH procedure)
• Look at the L3 signalling
CPICH coverage – Ec/N0
[Figure: CPICH Ec/N0 coverage maps for Operator 1 and Operator 2]
According to the Ec/N0 values, both operators have good coverage. There are a couple of RED areas, which need to be investigated further.
If large interference areas are generated, the problem could later be minimised by adjusting the antenna direction or height, by downtilting the antenna, or by slightly tuning the pilot power levels.
Data Connection
The typical RTT in a UMTS network is ~200 ms, which enables good-quality conversational PS services, such as VoIP.
Video streaming
On a mobile phone display, a streaming bit rate of ~60 kbps produces good video quality.
Web browsing
[Figures: service access time, web page download time and throughput (instantaneous vs. average) charts for both operators]

                            | Operator 1 | Operator 2
Service accessibility (%)   | 100 %      | 100 %
Service access time (s)     | 0.25       | 0.26
Web page download time (s)  | 22.79      | 16.59
Service retainability (%)   | 100 %      | 100 %

Sample web page: 319 kB
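
As a rough cross-check of the table above, the average throughput implied by the 319 kB sample page and the measured download times can be computed directly (assuming 1 kB = 1000 bytes and ignoring protocol overhead):

    page_kbit = 319 * 8   # sample web page size in kilobits
    for operator, download_s in {"Operator 1": 22.79, "Operator 2": 16.59}.items():
        print(operator, round(page_kbit / download_s), "kbps")  # ~112 and ~154 kbps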
Conclusions
• In 3G networks, QoS management is required
  – Real-time services require QoS guarantees
  – There is a need to support different kinds of services
  – With QoS mechanisms, operators can use their network resources more efficiently and gain competitive advantage
• To maintain and improve network performance and user-experienced service quality, constant monitoring and performance follow-up is needed
  – Successful network measurements are based on correct KPI definitions
  – A combination of end-to-end field measurements, interface probes, network element counter statistics and customer feedback is required
• The measurement results show that there are big differences in the performance of operators’ UMTS networks
  – Currently UMTS networks are not fully optimised → there is a clear need for optimisation!
  – The majority of 3G measuring equipment and terminals are still quite immature
For more information about Omnitele,
please visit our web site
www.omnitele.fi
KPI Definitions
AMR Speech KPIs
Parameters:
• Service coverage
• Speech quality
• Codec usage
• Service accessibility
• Service access time
• Service retainability

Trigger points:
• T0: Place a call (channel request)
• T1: Alerting message (ALERTING)
• T2: Speech interchange (start of audio stream)
• T3: Intentional termination of session (RELEASE)
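
One plausible reading of these trigger points, sketched in Python below, turns the timestamps into the time-based KPIs named above; the exact KPI definitions in the thesis may differ slightly.

    def amr_call_kpis(t0: float, t1: float, t2: float, t3: float) -> dict:
        """Time-based KPIs derived from the trigger points T0..T3 (seconds)."""
        return {
            "alerting_delay_s": t1 - t0,       # channel request -> ALERTING
            "service_access_time_s": t2 - t0,  # channel request -> start of audio stream
            "call_duration_s": t3 - t2,        # speech interchange -> intentional RELEASE
        }

    # Example with made-up timestamps from a drive-test log:
    print(amr_call_kpis(0.0, 3.1, 4.6, 64.6))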
Video Telephony KPIs
Parameters:
• Service coverage
• Speech quality
• Video quality
• Audio/video synchronisation
• Service accessibility
• Video call setup success ratio
• Service access time
• Video call setup time
• Service retainability

Trigger points:
• T0: Video call request (channel request)
• T1: Alerting message (ALERTING / call accepted)
• T2: Audio/video output starts
• T3: Intentional termination of session (audio/video output ends, RELEASE)
Video Streaming KPIs
Parameters:
• Service coverage
• Video quality
• Audio quality
• Audio/video synchronisation
• Service accessibility
• Service access time
• Streaming reproduction start delay
• Streaming reproduction start failure
• Streaming reproduction cut-off ratio

Trigger points:
• T0: Stream request (RTSP: SETUP)
• T1: 1st data packet (RTP payload); buffering message appears on the player (BUFFERING)
• T2: Streaming reproduction starts – picture appears (PLAY)
• T3: Intentional termination of session – video/audio stream ends (RTSP: TEARDOWN)
Web Browsing KPIs
Parameters:
• Service coverage
• Service accessibility
• Service access time
• Web page download time
• Service retainability

Trigger points:
• T0: Service access (1st TCP [SYN])
• T1: Data transfer starts (1st HTTP: GET)
• T2: Reception of last data packet (HTTP: FIN/ACK)
• T3: Display data; intentional termination of session
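
Assuming service access time is measured from the first TCP SYN to the first HTTP GET (T1 - T0) and web page download time from the first GET to the last received data packet (T2 - T1) — a reading of the trigger points above, not a definition taken verbatim from the thesis — the KPIs can be pulled out of a packet trace as follows:

    # Simplified trace: (timestamp in seconds, trigger name) pairs.
    def web_browsing_kpis(events: list[tuple[float, str]]) -> dict:
        t = {name: ts for ts, name in events}
        return {
            "service_access_time_s": t["first_http_get"] - t["first_tcp_syn"],    # T1 - T0
            "page_download_time_s": t["last_data_packet"] - t["first_http_get"],  # T2 - T1
        }

    trace = [(0.00, "first_tcp_syn"), (0.25, "first_http_get"), (22.9, "last_data_packet")]
    print(web_browsing_kpis(trace))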
E-mail KPIs
Parameters:
• Service coverage
• Service accessibility
• Service access time
• Sending time
• Receiving time
• Service retainability

Trigger points:
• T0: Service access (1st TCP [SYN])
• T1: SMTP: 250 ACK (HELO)
• T2: E-mail sending – last data packet sent (TCP [FIN/ACK])
• T3: E-mail download (IMAP: FETCH Body)
• T4: Last data packet received (TCP [FIN/ACK])