www.anite.com/nemo

INDOOR/OUTDOOR ACCEPTANCE AND BENCHMARKING MEASUREMENTS & ANALYSIS
WALKER AIR, INVEX II AND XYNERGY
Jari Ryynänen, Director of Product Management and Business Development, EMEA Region
Bruno Poisson, Regional Director
September 2015 © 2015

HOW TO MAKE ACCEPTANCE, VERIFICATION AND BENCHMARKING MEASUREMENTS FOR AIRPORTS, SHOPPING MALLS, STADIUMS, ENTERPRISES, TRAINS, ETC.?

BACKPACK MEASUREMENT SOLUTION
● Light, easy-to-carry measurement solution for indoor testing and benchmarking
● Runtime of the complete system with battery pack: 10 hours
● One charging cable for the whole system (12 V / 220 V / 110 V)

NEMO WALKER AIR – INTRODUCTION
● Lightweight benchmarking measurement system based on Nemo Handy-A
● One tablet acting as the Master Control Unit
● Up to 6 Slave units performing measurements
  – Support for the latest terminal models: Samsung S4, S5, S6, Note 3/4, etc.
  – GSM, WCDMA, HSPA+, LTE, LTE-A Cat 6
● Support for scanners: PCTEL IBflex and DRT 4311B
● Centralized control of the Slave units from the Master
  – Synchronized time and scripting
  – Status display of all units
  – Indoor marker sharing to Slaves
  – Timestamped textual markers
  – Pop-up questionnaires
● Supports all the latest testing applications, including VoLTE, CSFB, voice quality (POLQA and PESQ), YouTube, Facebook, LinkedIn, Instagram, Twitter, Dropbox, FTP/HTTP data transfer, HTML browsing, iPerf, etc.
● Supports iBwave floor plans and outdoor maps: BTS and antenna locations, zones, etc.
● Log files can be uploaded to an FTP/HTTP server or collected from the Slaves to the Master unit

MULTIPLE DEVICE MONITORING FROM THE TABLET

SMALL CELL / DISTRIBUTED ANTENNA SYSTEM TESTING SCENARIOS
● eNodeB / antenna validation
  – Coverage (LTE / HSPA)
  – Verification that the small cells / DAS antennas are transmitting
  – Check of planned channels (PCI/SCR codes)
● Floor testing
  – Accessibility (applications)
  – Retainability (applications)
  – Mobility (handovers)
  – Throughput
● Testing of entrances
  – Macro ingress
  – Handovers and cell reselections between the macro layers and the indoor system

IBWAVE FLOOR PLAN SUPPORT – DISTRIBUTED ANTENNA SYSTEMS
● Supports multiple floors per building
● Includes BTS and Distributed Antenna System (DAS) locations for acceptance testing

COMPLETE TOOL SET FOR INDOOR VERIFICATION, ACCEPTANCE TESTING AND BENCHMARKING – NEMO WALKER AIR
Shopping malls, stadiums, airports…
1. Data collection and loading
2. Reporting
3. Analysis
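The Walker Air slides above note that collected log files can be uploaded to an FTP/HTTP server as part of the data collection and loading step. A minimal sketch of that kind of upload workflow is shown below; it is a generic illustration only, not the product's built-in mechanism, and the host, credentials and ".nmf" file extension are assumptions.

```python
# Illustrative sketch: generic FTP upload of collected measurement log files.
# Host, credentials and the ".nmf" extension are assumptions, not product behaviour.
from ftplib import FTP
from pathlib import Path

def upload_logs(log_dir: str, host: str, user: str, password: str) -> None:
    """Upload every log file in log_dir to the FTP server's current directory."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        for log_file in sorted(Path(log_dir).glob("*.nmf")):
            with open(log_file, "rb") as fh:
                ftp.storbinary(f"STOR {log_file.name}", fh)
            print(f"Uploaded {log_file.name}")

# Example use (hypothetical server and folder):
# upload_logs("collected_logs", "ftp.example.com", "user", "secret")
```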
NEMO INVEX II BENCHMARKING SOLUTION
● Nemo Invex II (hardware)
  – Capable of supporting a large number of measurement devices: handsets, scanners and USB modems
  – Multiple chassis can be combined for an even higher device count
● Nemo Outdoor (software)
  – Based on the proven Nemo Outdoor platform
  – Rich suite of applications for benchmarking
  – Can also be used for drive testing
  – Customizable user interface; data views and workspaces can be customized
● Nemo Media Router (NMR)
  – Proprietary communication interface and application developed for Android-based smartphones (patent pending)
  – Enables voice quality (PESQ & POLQA), VoLTE and data testing measurements on smartphones without any additional hardware
● Nemo Server
  – Enables the user to perform both data and voice quality measurements with Nemo Outdoor, Nemo Invex, Nemo Autonomous and Nemo Handy (NMR inside)
● Nemo Invex Remote Control – Nemo Commander
  – Centralized remote control of field measurement units via Nemo Commander
● Post-processing
  – Nemo Xynergy, Nemo WinCatcher, Nemo Analyze

NEMO INVEX II HIGHLIGHTS
● Future-proof platform supporting higher data rates, including HSPA+ and LTE-A Cat 6 and beyond
● More capacity
  – Higher test device density per CPU
  – Max. 50 UEs + 3 scanner receivers (Nemo FSR1, PCTEL, R&S, DRT, Venture)
● Support for bigger smartphones (Samsung Galaxy Note 4, etc.)
● Improved reliability, usability and field installation through complete accessory sets
● QoS/QoE testing, including the latest voice and video algorithms (PESQ ITU-T P.862, POLQA ITU-T P.863, PEVQ-S)
● Power efficiency: power per test device reduced by ~50%
● Wi-Fi support
● Customizable user interface; data views and workspaces can be customized
● Remote control via Nemo Commander for enhanced project quality and lower costs

BENCHMARKING – APPLICATIONS
● Operators want to know the competitive positioning of their mobile network services across different applications
● Available E2E QoE & QoS measurements:
● Standard applications/protocols:
  – Voice call, video call, video streaming, YouTube video streaming
  – SMS, MMS and USSD measurements
  – ICMP ping / ping trace route
  – FTP testing for file transfers (multithreaded)
  – SFTP testing for file transfers (download and upload)
  – HTTP/S testing for file transfers (download and upload)
  – HTTP/S testing for web browsing
  – SMTP/POP3/IMAP for email testing
  – iPerf for UDP/TCP testing
  – Facebook testing
  – IP packet capturing and playback
● Optional applications/protocols:
  – Voice quality PESQ and POLQA (ITU-T P.862.1, WB-AMR, P.862.2, ITU-T P.863)
  – VoLTE voice call support (Qualcomm IMS, Samsung IMS), CS fallback support
  – PVI video quality streaming, PEVQ-S video streaming analysis measurements
  – SIP-based VoIP call testing
  – PTT call testing (QChat and Kodiak InstaPoC™)
  – Nemo Media Router (on-device measurements): FTP / HTTP / ping / browsing, YouTube, Twitter, Instagram, LinkedIn

WHAT TO MEASURE, WHAT TO BENCHMARK? VOICE CALL KPIS
● Radio network unavailability (ETSI TS 102 250-2)
  – Definition: (probing attempts with mobile services not available) / (all probing attempts) × 100
  – Probing attempt:
    ● GSM: check the C1 criterion
    ● GPRS: check the GPRS-specific information within SI 3
    ● WCDMA: check the S criterion
  – Probing attempt, network not available: the technical condition is not met
  – Probing attempt, network available:
    ● GSM: C1 criterion > 0
    ● GPRS: GPRS-specific information within SI 3 exists
    ● WCDMA: S criterion satisfied
● Call block rate (Telephony Service Non-Accessibility, ETSI TS 102 250-2)
  – Definition: (unsuccessful call attempts) / (all call attempts) × 100
    ● Start trigger point: push of the send button or script command
    ● Stop trigger point: alerting received by the A-party
● Call drop rate (Telephony Cut-off Call Ratio, ETSI TS 102 250-2)
  – Definition: (unintentionally terminated telephony calls) / (all successful telephony call attempts) × 100
    ● Start trigger point: alerting received by the A-party
    ● Stop trigger point: intentional release of the connection
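To make the ETSI TS 102 250-2 ratio definitions above concrete, the short sketch below computes the call block rate and call drop rate from a list of call records. The record structure and the sample values are hypothetical and chosen only to illustrate the arithmetic; they do not reflect a Nemo log format.

```python
# Hypothetical call records: each attempt notes whether alerting was received
# (call setup succeeded) and, if so, whether the call ended intentionally.
calls = [
    {"alerting_received": True,  "intentional_release": True},
    {"alerting_received": True,  "intentional_release": False},  # dropped call
    {"alerting_received": False, "intentional_release": False},  # blocked call
    {"alerting_received": True,  "intentional_release": True},
]

attempts = len(calls)
successful = [c for c in calls if c["alerting_received"]]
blocked = attempts - len(successful)
dropped = sum(1 for c in successful if not c["intentional_release"])

# Telephony Service Non-Accessibility: unsuccessful attempts / all attempts x 100
call_block_rate = 100.0 * blocked / attempts
# Telephony Cut-off Call Ratio: dropped calls / successful call attempts x 100
call_drop_rate = 100.0 * dropped / len(successful)

print(f"Call block rate: {call_block_rate:.1f} %")  # 25.0 %
print(f"Call drop rate:  {call_drop_rate:.1f} %")   # 33.3 %
```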
WHAT TO MEASURE, WHAT TO BENCHMARK? VOICE CALL KPIS – EXAMPLE
● Voice quality (POLQA MOS, ITU-T P.863)
  – The maximum value in a good SNR situation should be around 4.5; the minimum value is 1 (ITU-T P.863)
  – MOS can be shown and analyzed on a per-call basis or on a per-sample basis (ETSI TS 102 250-2)
● Call setup time (ETSI TS 102 250-2)
  – Definition: time from the call attempt event (push of the send button or script command) to alerting received by the A-party (CAC2)

WHAT TO MEASURE, WHAT TO BENCHMARK? VIDEO STREAMING KPIS
● Service accessibility
  – DNS host name resolution time (ms)
  – Streaming video service access time
  – Streaming setup delay
  – Streaming transfer attempts
  – Streaming setup success
  – Streaming setup success rate
● Service retainability
  – Streaming buffering count
  – Streaming transfer failures
  – Streaming transfer success
  – Streaming transfer success rate
● Service integrity
  – Streaming transfer time (ms)
  – Streaming completion success rate
  – Application data throughput DL

WHAT TO MEASURE, WHAT TO BENCHMARK? FACEBOOK KPIS
● Service accessibility
  – DNS host name resolution time (ms)
  – Service access time
  – Facebook connection attempts
  – Facebook connection success
  – Facebook connection failures
  – Facebook connection attempt success rate
  – Facebook connection success rate
  – Facebook connection time
● Service retainability
  – Facebook disconnects (dropped)
  – Facebook disconnects (normal)
  – Facebook transfer attempts
  – Facebook transfer failures
  – Facebook transfer success
● Service integrity
  – Facebook transfer success rate
  – Facebook transfer time
  – Application data throughput DL/UL

DATA TESTING: LTE CAT-6 WITH CARRIER AGGREGATION (3GPP REL-10)
● The Nemo Outdoor platform provides true and accurate data throughput metrics without any limitations
● Application total data throughput (~290 Mbps) and PDSCH data throughput values per cell/carrier
● Measured with a Samsung Note 4
● An FTP client running on the phone was used (NMR solution)
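The application data throughput KPI listed above is the application-layer byte count divided by the transfer time; with carrier aggregation the per-carrier contributions add up to the total. A minimal sketch follows; the per-carrier byte counts and duration are made-up values chosen only so that the result lands roughly in the range of the ~290 Mbps figure quoted on the slide.

```python
def throughput_mbps(bytes_transferred: int, duration_s: float) -> float:
    """Application-layer throughput in Mbit/s."""
    return bytes_transferred * 8 / duration_s / 1e6

# Hypothetical Cat-6 FTP download: two aggregated carriers reported separately.
per_carrier_bytes = {"PCell": 2_250_000_000, "SCell": 1_950_000_000}  # bytes in 120 s
duration_s = 120.0

for cell, n_bytes in per_carrier_bytes.items():
    print(f"{cell}: {throughput_mbps(n_bytes, duration_s):.1f} Mbit/s")

total_bytes = sum(per_carrier_bytes.values())
print(f"Total application throughput: {throughput_mbps(total_bytes, duration_s):.1f} Mbit/s")
# Output: PCell 150.0, SCell 130.0, total 280.0 Mbit/s
```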
REPORTING: NEMO XYNERGY SYSTEM MODULES
● SON, troubleshooting, optimisation and reporting
● Rules, analytics and correlation
● End-to-end analysis
● Data sources: drive test (acceptance, planning, optimization, benchmark), small cells, network management OSS (PM, CM, FM, trace, others), customer experience (CEM) and others

XYNERGY DT BENCHMARKING – REPORT VIEWS
● Benchmarking dashboards provide a clear comparative view between operators for the established KPIs

XYNERGY NM – ONLINE ANALYSIS – OMC PERFORMANCE MANAGEMENT (PM) DATA
● Sample view showing a KPI plot on a map with synchronized charts and the associated sector-carrier parameters
● OMC and drive test data can be cross-correlated

BENCHMARKING CASE STUDIES AROUND THE WORLD
● Regulator 1 in Asia – Nemo Invex I
  – Invex I used for benchmarking by drive test
  – 3 operators; 3G/4G voice, data and coverage testing
  – PESQ MOS for voice, FTP, HTTP browsing, SMS, coverage
  – Handy used for indoor coverage measurements in idle mode
  – Results are published on the Internet
● Regulator in Greece – Nemo Invex II, Walker Air
  – Invex II used for benchmarking by drive test and fixed-location testing
  – Walker Air used for indoor testing
  – Explorer IV for fixed-location autonomous testing
  – 3 operators; 2G/3G/4G voice, data and coverage testing
  – POLQA MOS for voice, FTP, HTTP browsing, email, coverage, network availability
  – Results will be published on the Internet
● Operator in Australia – Nemo Walker Air
  – Nemo Walker Air used for indoor, outdoor and mountain coverage measurements
  – Voice, data and coverage measurements
● Regulator 2 in Asia – Nemo Invex I and II
  – Invex I and II used for benchmarking by drive test
  – 2G/3G/4G voice, data and coverage testing
  – POLQA MOS for voice, FTP, HTTP, SMS, YouTube, ping, coverage testing
  – LTE Cat-6, MIMO performance testing

BENCHMARKING FOR REGULATORS IN A NUTSHELL
1. Automation (collection and reporting) provides many benefits
   – Dependable and reliable results
   – Lower cost base
   – Enhanced quality
   – On-the-fly results
2. Remote control enhances drive test quality and keeps the project under control
   – On average, 15% of drive tests fail
   – On-the-fly reports enable swift correction of quality issues
3. Carry out operator benchmarking and coverage verification (with scanners) at the same time
   – Mutualized costs reduce the total cost
4. Focus on testing the most important services
   – Voice quality (MOS) in 2G/3G/4G
   – SMS
   – YouTube testing and video MOS
   – Facebook testing
5. Test all important environments
   – Roads, hot spots, indoor (enterprises, consumers), trains, metro, etc.

www.anite.com/nemo
Contact the team
● Talk: +358 50 395 7700
● Read: anite.com/NEMO
● Write: NEMO@anite.com