15-744 Computer Networking
Lecture 18 – Internet Video Delivery
Matt Mukerjee
Slides: Hui Zhang, Peter Steenkiste, Athula Balachandran, Srini Seshan, et al.

The Internet Today… Video
• [Figure: 2014 1H, North America fixed-access traffic by source; nearly every top source is video: video sites, probably video, encrypted video(?), video purchases, "video" in the name, DEFINITELY video, video sharing, ads (and videos!)]

Demystifying Video Delivery
• Video source → ??? (Internet) → video player → screen
• What's the ???  Today's lecture.

Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery

Terminology
• Frame
• Resolution
  • Number of pixels per frame
  • 160x120 to 1920x1080 (1080p) to 4096x2160 (4K)
• FPS (frames per second)
  • 24, 25, 30, or 60
• Bitrate
  • Information stored/transmitted per unit time
  • Usually measured in kbps to Mbps
  • Ranges from 200 kbps to 30 Mbps

Internet Video Requirements
• Smooth/continuous playback
• Elasticity to startup delay: need to think in terms of RTTs
• Elasticity to throughput
  • Multiple encodings: 200 kbps, 1 Mbps, 2 Mbps, 6 Mbps, 30 Mbps
• Multiple classes of applications with different requirements:

  Application          Delay      Bandwidth                                    Examples
  2/N-way conference   < 200 ms   4 kbps audio only; 200 kbps – 5 Mbps video   Skype, Google Hangouts, Polycom, Cisco
  Short-form VoD       < 1–5 s    300 kbps – 2 Mbps & higher                   YouTube
  Long-form VoD        < 5–30 s   500 kbps – 6 Mbps & higher                   Netflix, Hulu, Qiyi, HBO GO
  Live broadcast       < 5–10 s   500 kbps – 6 Mbps & higher                   WatchESPN, MLB
  Linear channel       < 60 s     500 kbps – 6 Mbps & higher                   DirecTV Live

Avoid Buffering like the Plague!
• [Figure: playback stalled at "Buffering… 51%"]
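The terminology above implies heavy compression: raw 1080p video at 30 fps is on the order of 1.5 Gbps, yet streams ship at single-digit Mbps. A quick back-of-the-envelope check (pure arithmetic, assuming 8-bit RGB, i.e., 24 bits per pixel):

```python
# Back-of-the-envelope: raw vs. encoded video bitrate.

def raw_bitrate_bps(width, height, fps, bits_per_pixel=24):
    """Uncompressed bitrate for 8-bit RGB video."""
    return width * height * bits_per_pixel * fps

raw = raw_bitrate_bps(1920, 1080, 30)  # 1080p at 30 fps
encoded = 6_000_000                    # a typical 6 Mbps long-form VoD stream

print(f"raw:         {raw / 1e9:.2f} Gbps")   # ~1.49 Gbps
print(f"encoded:     {encoded / 1e6:.0f} Mbps")
print(f"compression: {raw // encoded}x")      # ~248x
```

So the codec is doing roughly 250x compression before the network ever sees a bit, which is why the encodings listed above (200 kbps to 30 Mbps) are feasible at all.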
1990 – 2004: 1st Generation Commercial PC/Packet Video Technologies
• Simple video playback, no support for rich apps
• Not well integrated with the Web browser
• No critical mass of compelling content over the Internet
• Not enough broadband penetration

2005: Beginning of the Internet Video Era
• 100M streams in the first year
• Premium sports webcast online

2006 – 2015: Internet Video Going Prime Time
• [Timeline figure: 2006, 2007, 2008, 2009, 2010, 2011]

First Generation: HTTP Download
• Browser requests the object(s) and, after receiving them, passes them to the player for display
• No pipelining

First Gen: HTTP Progressive Download (2)
• Alternative: set up a connection between server and player; the player takes over
• Web browser requests and receives a meta file (a file describing the object) instead of the file itself
• Browser launches the appropriate player and passes it the meta file
• Player sets up a connection with the Web server and downloads or streams the file

Meta File Requests
• [Figure: meta-file request flow between browser, Web server, and player]

Buffering Continuous Media
• Jitter = variation from ideal timing
• Media delivery must have very low jitter
  • Video: frames every 30 ms or so
  • Audio: ultimately, samples need < 1 ns jitter
• But network packets have much more jitter than that!
• Solution: buffers
  • Fill the buffer over the network with best-effort service
  • Drain the buffer via low-latency local access

Streaming, Buffers and Timing
• [Figure: buffer size vs. time; a "good" region of smooth playback lies between the max buffer size ("bad": buffer overflows; max buffer duration = allowable jitter) and an almost-empty buffer ("bad": buffer underflows and playback stops)]

Drawbacks of HTTP Progressive Download
• HTTP connection keeps data flowing as fast as possible into the user's local buffer
• May download lots of extra data if the user does not watch the entire video
• TCP file transfer can use more bandwidth than necessary
• Mismatch between whole-file transfer and stop/start/seek playback controls
  • However: the player can use HTTP Range requests to seek to a video position
• Cannot change video quality (bitrate) to adapt to network congestion
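Seeking within a progressive download works by mapping a time offset to a byte offset and issuing an HTTP Range request from there. A minimal sketch for a roughly constant-bitrate file (the helper name and CBR assumption are illustrative, not from any real player):

```python
# Sketch: seeking within a progressive download via HTTP Range requests.
# Assumes a roughly constant-bitrate (CBR) file, so a time offset maps
# linearly to a byte offset; the player then requests only the tail.

def range_header_for_seek(seek_seconds, bitrate_bps, file_size):
    """Map a seek time to an HTTP Range header for a CBR file."""
    byte_offset = min(seek_seconds * bitrate_bps // 8, file_size - 1)
    return {"Range": f"bytes={byte_offset}-"}

# Seek 60 s into a 1 Mbps, 100 MB file:
hdr = range_header_for_seek(60, 1_000_000, 100_000_000)
print(hdr)  # {'Range': 'bytes=7500000-'}
```

Real containers (MP4, FLV) are not perfectly CBR, so actual players consult an index of keyframe byte offsets instead of this linear estimate; the Range mechanism is the same.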
2nd Generation: Real-Time Streaming
• Build our own protocol! This gets around HTTP; an app-layer protocol can be better tailored to streaming multimedia

Example: Real-Time Streaming Protocol (RTSP)
• [Figure: RTSP session between client and media server]

Drawbacks of Real-Time Streaming
• Per-client state!
• Streaming protocols often blocked by routers
• HTTP delivery can use ordinary proxies and caches
• Conclusion: rather than adapt the Internet to streaming, adapt media delivery to the Internet

3rd Generation: HTTP Streaming
• Client-centric architecture with stateful client and stateless server
  • Standard server: Web servers
  • Standard protocol: HTTP
  • Session state and logic maintained at the client
• Video is broken into multiple chunks (as with BitTorrent)
• Each chunk can be played independently of other chunks
• Playing chunks in sequence gives seamless video
• Meta file (manifest) provides chunk file names

HTTP Chunking Protocol
• [Figure: the client player issues an HTTP GET for the manifest (listing chunks A1, A2, …), then GETs A1, A2, … in sequence; requests flow through the browser cache and ordinary Web servers over HTTP/TCP]

Adaptive Bitrate with HTTP Streaming
• Encode video at different quality/bitrate levels
• Client can adapt by requesting chunks at different bitrate levels
• Chunks of different bitrates must be synchronized
  • All encodings have the same chunk boundaries and all chunks can be played independently, so you can make smooth splices to chunks of higher or lower bitrates

HTTP Chunking Protocol (adaptive)
• [Figure: the manifest now lists low-quality chunks A1, A2, … and high-quality chunks B1, B2, …; the player fetches A1 at low quality, then switches up to B2 at a chunk boundary]
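The adaptive-bitrate idea above can be sketched as a rate-based chooser: measure throughput from the last chunk download, then request the highest encoding the connection can sustain. This is a minimal heuristic, not a real player's algorithm (production players also factor in buffer occupancy); the safety factor and bitrate ladder below are illustrative, with the ladder taken from the encodings listed earlier:

```python
# Minimal rate-based ABR: pick the highest bitrate at or below a safety
# fraction of the measured throughput. (A sketch, not a real player.)

BITRATES_BPS = [200_000, 1_000_000, 2_000_000, 6_000_000, 30_000_000]

def choose_bitrate(measured_throughput_bps, safety=0.8):
    usable = measured_throughput_bps * safety
    candidates = [b for b in BITRATES_BPS if b <= usable]
    # Fall back to the lowest encoding if even that exceeds the estimate.
    return candidates[-1] if candidates else BITRATES_BPS[0]

# 4 Mbps measured -> 0.8 * 4 Mbps = 3.2 Mbps usable -> pick the 2 Mbps chunk
print(choose_bitrate(4_000_000))  # 2000000
```

Because every encoding shares the same chunk boundaries, the player can run this chooser once per chunk and splice the result in seamlessly.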
Advantages of HTTP Streaming
• Easy to deploy: it's just HTTP!
• Works with existing caches/proxies/Web servers/firewalls
• Fast startup by downloading the lowest-quality/smallest chunk
• Bitrate switching is seamless
• Many small files
  • Small with respect to the movie size
  • Large with respect to TCP
  • 5–10 seconds of 1 Mbps – 3 Mbps video = 0.5 MB – 4 MB per chunk

Demystifying Video Delivery
• Video source → encoders & video servers → CMS and hosting → ??? → ISP & home net → video player → screen
• The remaining ???: Content Delivery Networks (CDNs)

Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery

Content Delivery Networks (CDNs)
• Goal: deliver content from optimal locations
  • (Generally: users in Pittsburgh go to a Pittsburgh server, users in Tokyo go to a Tokyo server, …)
• Customers (e.g., CNN, ESPN, …) pay the CDN to deliver their content
• Variety of companies: Akamai, Limelight, Level 3, …
• Akamai is one of the most well known
  • 170,000+ servers in 102 countries
  • 15–30% of Internet traffic

How Akamai Works
• [Figure, steps 1–12: the end user fetches index.html from cnn.com (the content provider); embedded object URLs point into Akamai; the resolver walks the DNS root, then Akamai's high-level DNS server, then its low-level DNS server to reach a nearby Akamai server; finally the browser issues "GET /cnn.com/foo.jpg" to that server]

Akamai – Subsequent Requests
• [Figure, steps 1, 2, 7, 8, 9, 12: on later requests, cached DNS answers let the client skip the root and high-level DNS and go through the low-level DNS to an Akamai server for /cnn.com/foo.jpg]
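Akamai's two-level DNS redirection can be caricatured as: a high-level lookup maps the client's region to a server cluster, and a low-level lookup picks a lightly loaded server within it. This toy model is illustrative only; all names, regions, and the least-loaded policy are made up, and Akamai's real mapping uses far richer network measurements:

```python
# Toy two-level CDN DNS: region -> cluster (high-level), then
# least-loaded server within the cluster (low-level).

CLUSTERS = {
    "us-east": {"a1.edge": 10, "a2.edge": 3},   # server -> current load
    "eu-west": {"b1.edge": 7, "b2.edge": 9},
}
REGION_OF = {"pittsburgh": "us-east", "paris": "eu-west"}

def resolve(client_region):
    cluster = CLUSTERS[REGION_OF[client_region]]   # high-level DNS answer
    return min(cluster, key=cluster.get)           # low-level DNS answer

print(resolve("pittsburgh"))  # a2.edge (load 3 beats load 10)
```

Short DNS TTLs make this mapping adjustable minute to minute, which is how the CDN steers clients away from overloaded or failed servers.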
Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery

What's wrong with this picture?
• [Figure: the delivery pipeline from video source to screen, as before]

Closing the Loop with Analytics
• [Figure: a content broker sits above the pipeline, gathering analytics from the video player and feeding them back to delivery decisions]

Why Analytics?
• Analytics help judge users' Quality of Experience (QoE)
• Users' perception of quality greatly impacts engagement
  • (and thus ad revenue)
• Example broker: Conviva

How can Conviva Help Increase QoE?
• Runs code directly in the client's browser, from the customer's website (e.g., CNN, ESPN, …)
• Gathers historic and real-time data from each client
• Predicts which CDN will provide the best QoE for a given client
  • (and sends the client there)
• Finds the optimal time to switch bitrates

Conviva Architecture
• [Figure: Conviva architecture]

Simple, Right?
• Turns out no single metric is a good indicator of QoE!

Commonly Used Quality Metrics
• Join time
• Buffering ratio
• Rate of switching
• Rate of buffering
• Average bitrate

Commonly Used Quality Metrics (2)
• Need some function that maps metrics to QoE
• Use machine learning? ✓
• Biggest problem… confounding factors!
  • (e.g., live video and VoD behave very differently)
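The per-session metrics above are computed from the player's event stream. A sketch for two of them, join time and buffering ratio; the timeline format here (state, duration) is made up for illustration and is not Conviva's actual telemetry schema:

```python
# Compute join time and buffering ratio from a hypothetical session
# timeline: an ordered list of (state, duration_seconds) entries.

def qoe_metrics(timeline):
    # Join time: time spent in the initial "joining" state before playback.
    join_time = 0.0
    for state, dur in timeline:
        if state != "joining":
            break
        join_time += dur
    # Buffering ratio: buffering time over total watch time.
    buffering = sum(d for s, d in timeline if s == "buffering")
    playing = sum(d for s, d in timeline if s == "playing")
    total = buffering + playing
    ratio = buffering / total if total else 0.0
    return join_time, ratio

session = [("joining", 2.0), ("playing", 50.0),
           ("buffering", 5.0), ("playing", 45.0)]
jt, br = qoe_metrics(session)
print(jt, round(br, 2))  # 2.0 0.05
```

The hard part, as the slides note, is not computing these metrics but mapping a vector of them to actual user engagement.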
Challenge: Confounding Factors
• Live and VoD sessions experience similar quality
• However, user viewing behavior is very different

Confounding Factors
• Device, type of video, popularity, location, connectivity, time of day, day of week
• Need a systematic approach to identify and incorporate confounding factors

How to Rank Confounding Factors?
• Relative information gain
  • ML for "how key is a particular factor?"
• Decision tree structure
  • Does this factor drastically change the DT structure?
• Tolerance
  • Do users tolerate changes in this factor more easily?

Identifying Key Confounding Factors
• [Table: each factor (type of video, popularity, location, time of day, day of week, device, connectivity) scored ✓/✗ against relative information gain, decision tree structure, and tolerance level, to pick out the key confounding factors]

Results
• Use the important confounding factors to create separate QoE estimator models (using ML)
• 20% increase in user engagement

Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery

Live Video
• Live video is becoming immensely popular
  • Demand increases 4–5x every three years
  • Expected to reach 50 Tbps this year
• Amazon buys Twitch for ~$1 billion
  • Users watch 150 billion minutes of live video per month
• A single World Cup stream = 40% of Internet traffic

Live Video Challenges
• Amount of demand
• Can't cache
• Workload variety (e.g., mega-events vs. long tail)
• Users want: good QoE (good bitrate, buffering ratio, join time)
• CDNs want: to meet user demand, minimize delivery cost
• WAN imposes: latency, packet drops, failures, …

CDN Live Video Delivery Background
• [Figure: video sources feed interior (reflector) clusters, which feed edge clusters; clients issue HTTP GETs to edges; per-channel distribution trees run from sources through reflectors to edges]
• [Figure: DNS maps ASes/clients to edges; channel 1 and channel 2 use different trees over high-cost and low-cost links]

Questions?