SABRE: A client based technique for
mitigating the buffer bloat effect of
adaptive video flows
Ahmed Mansy, Mostafa Ammar
(Georgia Tech)
Bill Ver Steeg
(Cisco)
What is buffer bloat?
Significantly increased queuing delays caused by TCP interacting with large buffers
[Figure: client and server connected over a bottleneck link of capacity C bps, with round-trip time RTT]
• The TCP sender tries to fill the pipe by increasing the sender window (cwnd)
• Ideally, cwnd should grow to BDP = C x RTT
• TCP uses packet loss to detect congestion, and then reduces its rate
• Large buffers increase queuing delays and also delay loss events
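A back-of-the-envelope sketch (using the 6 Mbps / 100 ms / 256-packet values that appear later in this deck, plus an assumed 1500-byte MTU) shows how a full tail-drop buffer dwarfs the BDP:

```python
# Sketch: why a large tail-drop buffer inflates queuing delay.
# Values match the testbed described later; the MTU is an assumption.
C_bps = 6_000_000                       # bottleneck capacity: 6 Mbps DSL link
rtt_s = 0.100                           # base round-trip time: 100 ms
bdp_bytes = C_bps / 8 * rtt_s           # bandwidth-delay product
buffer_bytes = 256 * 1500               # 256-packet tail-drop queue, 1500-byte MTU
queuing_delay_s = buffer_bytes * 8 / C_bps  # extra delay when the buffer is full

print(f"BDP         = {bdp_bytes:.0f} bytes")
print(f"Buffer      = {buffer_bytes} bytes")
print(f"Added delay = {queuing_delay_s * 1000:.0f} ms when the buffer is full")
```

With these values the buffer holds 384 kB against a 75 kB BDP, so a full queue adds about 512 ms of delay — far beyond what interactive traffic can tolerate.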
DASH: Dynamic Adaptive Streaming
over HTTP
Video is split into short segments, each encoded at several bitrates (350, 600, 900, 1200 kbps); the client selects a bitrate per segment using a manifest fetched from an HTTP server.
[Figure: download rate and buffer occupancy over time — an initial buffering phase fills the playout buffer to 100%, followed by a steady state of On/Off download periods]
S. Akhshabi et al., “An experimental evaluation of rate-adaptation algorithms in adaptive streaming over HTTP”, MMSys ’11
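For illustration, a minimal throughput-based bitrate picker over the slide's ladder might look like the sketch below (a generic illustration, not the adaptation algorithm evaluated in the cited paper; the 0.8 safety factor is an assumption):

```python
# Generic sketch of throughput-based DASH bitrate selection.
BITRATES_KBPS = [350, 600, 900, 1200]  # bitrate ladder from the slide

def pick_bitrate(measured_throughput_kbps, safety=0.8):
    """Pick the highest bitrate below a safety fraction of measured throughput."""
    usable = measured_throughput_kbps * safety
    candidates = [b for b in BITRATES_KBPS if b <= usable]
    return candidates[-1] if candidates else BITRATES_KBPS[0]

print(pick_bitrate(1000))  # -> 600 (0.8 * 1000 = 800; highest rung <= 800)
```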
Problem description
[Figure: DASH and VoIP flows sharing the same bottleneck link]
Does DASH cause buffer bloat?
Will the quality of VoIP calls be affected by DASH flows?
And if so, how can we solve this problem?
Our approach
• To answer the first two questions, we perform experiments on a lab testbed to measure the buffer bloat effect of DASH flows
• We developed a scheme, SABRE (Smooth Adaptive BitRatE), to mitigate this problem
• We use the same testbed to evaluate our solution
Measuring the buffer bloat effect
[Testbed: a DASH client and an iPerf client share a 6 Mbps (DSL-like) bottleneck emulator with a 256-packet tail-drop queue and 100 ms RTT; the HTTP video server and iPerf server sit behind a 1 Gbps link. The iPerf pair generates 80 kbps of UDP traffic (150-byte packets), emulating OTT VoIP traffic.]
Adaptive HTTP video flows have a significant effect on VoIP traffic.
Understanding the problem –
Why do we get large bursts?
[Figure: a burst sent at 1 Gbps piles up in the queue at the 6 Mbps bottleneck]
TCP is bursty: after an idle period, the server can send a large burst at line rate, which queues up at the bottleneck.
Possible solutions
• Middlebox techniques
– Active Queue Management (AQM)
• RED, BLUE, CoDel, etc.
• RED is on every router but hard to tune
• Server techniques
– Rate limiting at the server to reduce burst size
• Our solution: smooth download driven by the client
Some hidden details
[Figure: two data channels — one from the server over the network into the OS socket buffer, and one from the socket buffer through recv into the player's playout buffer]
In traditional DASH players:
• the player runs while(true) recv
• the two channels are therefore coupled
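The coupled behavior can be sketched as a tight recv loop (a Python sketch; `sock` is assumed to be a connected TCP socket):

```python
# Sketch of a traditional (coupled) DASH download loop: recv is called as
# fast as possible, so the OS socket buffer is drained immediately and the
# advertised window (rwnd) stays large -- enabling big bursts.
def download_segment(sock, segment_len):
    data = bytearray()
    while len(data) < segment_len:
        chunk = sock.recv(65536)   # tight loop: no pacing at all
        if not chunk:              # connection closed
            break
        data += chunk
    return bytes(data)
```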
Smooth download to eliminate bursts
Idea: TCP can send a burst of at most min(rwnd, cwnd). rwnd is a function of the empty space in the receiver's socket buffer. Since we cannot control cwnd, we control rwnd.
Two objectives:
• keep the socket buffer almost full all the time
• do not starve the playout buffer
[Figure: the client-side path again — server → OS socket buffer → recv → playout buffer]
Keeping the socket buffer full – controlling the recv rate
[Figure: with "while(1) recv", each GET (S1, S2, ...) is downloaded in a fast On burst followed by an Off period; the socket buffer stays empty, so every new GET triggers a large, bursty transfer. With "while(timer) recv", the client reads from the socket at a paced rate over timer intervals T, so data drains from the socket buffer smoothly.]
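A paced "while(timer) recv" loop might look like this sketch (chunk size and tick interval are illustrative assumptions, not values from the paper):

```python
import time

# Sketch of a timer-paced recv loop: read at most rate_bytes_per_sec from
# the socket, leaving the rest queued in the OS socket buffer so that the
# advertised window (rwnd) stays small.
def paced_download(sock, segment_len, rate_bytes_per_sec, interval=0.05):
    chunk = int(rate_bytes_per_sec * interval)  # bytes allowed per tick
    data = bytearray()
    while len(data) < segment_len:
        start = time.monotonic()
        piece = sock.recv(min(chunk, segment_len - len(data)))
        if not piece:                            # connection closed
            break
        data += piece
        elapsed = time.monotonic() - start
        if elapsed < interval:                   # sleep off the rest of the tick
            time.sleep(interval - elapsed)
    return bytes(data)
```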
Keeping the socket buffer full
HTTP Pipelining
#Segments = 1 + (Socket buffer size / Segment size)
[Figure: without pipelining, the client issues GET S2 only after consuming S1, so the socket buffer drains during Off periods. With pipelining, the client keeps #Segments requests outstanding: GET S1, S2 up front, then GET S3 when S1 is consumed, GET S4 when S2 is consumed, and so on.]
The socket buffer is always full, so rwnd stays small.
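The formula above can be applied directly; the example socket-buffer size and segment parameters below are assumptions for illustration:

```python
# How many requests to keep outstanding so the socket buffer never drains
# (formula from the slide).
def pipelined_segments(socket_buffer_bytes, segment_bytes):
    return 1 + socket_buffer_bytes // segment_bytes

# e.g. a 256 KB socket buffer and 2-second segments at 900 kbps
segment_bytes = 900_000 // 8 * 2                      # 225,000 bytes per segment
print(pipelined_segments(256 * 1024, segment_bytes))  # -> 2
```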
Still one more problem
• The socket buffer level drops temporarily when the available bandwidth drops
[Figure: when available bandwidth falls below the video bitrate, socket buffer occupancy dips]
• This results in larger values of rwnd
  – which can lead to large bursts, and hence delay spikes
• Continuous monitoring of the socket buffer level can help
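One way to monitor socket-buffer occupancy from user space is the FIONREAD ioctl, which reports the bytes currently queued in the receive buffer (a Linux/Unix sketch; not necessarily how SABRE's VLC implementation does it):

```python
import fcntl, struct, termios

# Sketch (Linux/Unix): read receive-queue occupancy with FIONREAD.
# SABRE-style logic could slow its recv pace when this value falls,
# to avoid opening rwnd and inviting a burst.
def recv_queue_bytes(sock):
    buf = struct.pack("I", 0)
    res = fcntl.ioctl(sock.fileno(), termios.FIONREAD, buf)
    return struct.unpack("I", res)[0]
```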
Experimental results
[Same testbed as before: a DASH client and an 80 kbps UDP VoIP-like iPerf flow (150-byte packets) share a 6 Mbps (DSL-like) bottleneck with a 256-packet tail-drop queue and 100 ms RTT; servers sit behind a 1 Gbps link.]
We implemented SABRE in the VLC DASH player.
Single DASH flow, constant available bandwidth
[Plot: queuing delay over time for On/Off vs. SABRE]
On/Off: delay > 200 ms about 40% of the time
SABRE: delay < 50 ms 100% of the time
Video adaptation: how does SABRE
react to variable bandwidth?
[Figure: client data path (server → socket buffer → recv → playout buffer) and a plot of available bandwidth vs. selected video bitrate over time, annotated with the adaptation cycle:]
• Socket buffer is full; the player cannot estimate the available bandwidth
• Player tries to upshift to a higher bitrate, but cannot sustain it
• Socket buffer gets drained: reduce the recv rate and down-shift to a lower bitrate
• Player can sustain this bitrate: shoot for a higher one
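The cycle above can be sketched as a tiny decision rule (a generic illustration of the described behavior, not the authors' exact adaptation logic):

```python
# One step of a SABRE-style upshift/downshift cycle, driven by the
# socket-buffer level rather than by measured throughput.
BITRATES = [350, 600, 900, 1200]  # kbps, ladder from the DASH slide

def adapt(current_idx, socket_buffer_full, socket_buffer_draining):
    """Return the next bitrate index for one adaptation step."""
    if socket_buffer_draining and current_idx > 0:
        return current_idx - 1              # can't sustain it: down-shift
    if socket_buffer_full and current_idx < len(BITRATES) - 1:
        return current_idx + 1              # buffer full: probe a higher bitrate
    return current_idx                      # otherwise hold steady
```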
Single DASH flow – variable available bandwidth
[Plot: available bandwidth drops from 6 Mbps to 3 Mbps at T=180 s and recovers at T=380 s; On/Off vs. SABRE over time]
Two clients
[Testbed: two clients, C1 and C2, sharing the bottleneck to the server]
Compared configurations: two On/Off clients vs. two SABRE clients.
Summary
• The On/Off behavior of adaptive video players can have a significant buffer bloat effect
• We designed and implemented a client-based technique to mitigate this problem
• Even a single On/Off client significantly increases queuing delays
• Future work:
  – Improve SABRE's adaptation logic for a mix of On/Off and SABRE clients
  – Investigate DASH-aware middlebox- and server-based techniques
Thank you!
Questions?
Backup slides
Can RED help?
Random Early Detection: [Plot: drop probability rises from P=0 at the min threshold to P=1 at the max threshold as the average queue size grows]
Once the burst is on the wire, not much can be done!
How can we eliminate large bursts?
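For reference, the classic RED drop-probability curve sketched on the slide looks like this (thresholds and max_p here are illustrative tuning values, which is exactly what makes RED hard to configure):

```python
# Textbook RED drop probability: 0 below min_th, linear ramp up to max_p
# between min_th and max_th, and 1 (drop everything) above max_th.
def red_drop_probability(avg_queue, min_th=50, max_th=150, max_p=0.1):
    if avg_queue < min_th:
        return 0.0
    if avg_queue >= max_th:
        return 1.0
    return max_p * (avg_queue - min_th) / (max_th - min_th)
```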
Single DASH flow, constant available bandwidth
[Backup plot: SABRE]
Single DASH flow, constant available bandwidth
[Backup plot: On/Off vs. SABRE — On/Off: delay > 200 ms about 40% of the time; SABRE: delay < 50 ms 100% of the time]
Single DASH flow – variable available bandwidth
[Backup plot: bandwidth drops from 6 Mbps to 3 Mbps at T=180 s and recovers at T=380 s; On/Off vs. SABRE]
Single ABR flow – variable available bandwidth
[Backup plot: On/Off vs. SABRE]
Two clients
At least one On/Off DASH client significantly increases queuing delays
Two clients
[Backup plot: additional two-client results]