Working Together to Address a More Complex Media Environment
Prepared for: Chilean Media Industry
July 2012

Agenda
- Meeting Purpose
- MRC Background
- Standards Setting
- International Work
- Audit and Methodology Issues
- Q&A

Meeting Purpose

Purpose
In the last several years, the media environment, enabled by new consumer access and technology, has become increasingly complex. This has led to:
- Increased need for standards in measurement: metrics, approaches, greater detail, etc.
- Growing support for MRC functions; other U.S. associations seeking MRC's leadership and assistance, for example on "viewability"
- Significant increases in audit requests
- Some recognition of MRC internationally and requests for our involvement in other countries, mostly driven by multi-national MRC members and several measurement services
- The developing need to assess audiences across media types
- Increased need for education to promote quality media measurement practices

Purpose
- We are here today to introduce you to MRC: our structure, mission, and the scope of our work.
- We want to give you a sense of the issues that are consuming our energy today.
- We want to make sure you understand the resources available to you: Standards, Auditing, etc.
- We ask you to consider participating in the MRC's international activities.

MRC Background

History and Mission of the MRC
- U.S. Congressional Hearings Held in 1963-64
- System of Self-Regulation Established
- Founded as the "Broadcast Rating Council"
- Not-for-Profit Entity
- Voluntary Process for Measurement Services to:
  - Supply Complete Information to the MRC
  - Comply with MRC Minimum Standards
  - Conduct the Service as Represented to Clients
  - Submit to Annual Audits
  - Pay for the Audit Costs (internal and external)
- Confidential Process Originally Recommended by the U.S. Congress

MRC Mission Statement
- To secure for the media industry and related users audience measurement services that are valid, reliable and effective;
- Set Standards; and
- Conduct Audits to Verify Compliance with Standards.

Resources
- MRC Staff:
  - George Ivie, Executive Director (1/2000)
  - Anthony Torrieri, SVP, Associate Director (3/2003)
  - Jim Peacock, Consultant
  - David Gunzerath, SVP, Associate Director (7/2007)
  Staff backgrounds include: a former Ernst & Young partner who performed MRC audits for 16 years with an information systems focus; 8 years as VP of NAB Research; 9 years with TV Guide and an accounting background; over 25 years of agency experience with extensive buyer-side knowledge and focus; over 30 years of experience in media research; and fifteen years as Head of Research at Arbitron.
- MRC Membership – Various Audit Committees and Technical Committees
- Independent CPAs:
  - Ernst & Young – Specialized Team
  - PricewaterhouseCoopers (Digital Audits Only)
  - Deloitte & Touche (Digital Audits Only)

Significant Growth and Complexity
Key Statistics in the U.S.:

                 1960s   1990   2000   Today
  MRC Members    15      15     48     145
  Audits         4       5      6      75*

  * Includes >50 Digital Audits; Excludes International Work

MRC Member Organizations

MRC-Audited Products – U.S.

Accredited:
- ACT1 Radio TPP
- Arbitron Local Diary Service, Max, County Coverage, CSAR
- Arbitron PPM – Atlanta, Cincinnati, Houston, Kansas City, Milwaukee, Minneapolis, Philadelphia, Phoenix, St. Louis
- Internet Audits:
  - comScore Direct
  - Nielsen Online Campaign Ratings
  - Omniture Adobe Compliant Traffic Report
  - Quantcast Quantified Publisher
  - Visible Measures
  - Ad-Serving: Cox Digital Solutions, AdTech, CBSi, Univision, Yahoo!, Disney, Atlas, DoubleClick (DFP, DFA), FreeWheel, GLAM Media, Google, LiveRail, Turner, AOL, MSN, 24/7, MediaMind, RealVu, Vindico, PointRoll, Auditude
  - Clicks: Google, Yahoo!, MS adCenter, MS Media Console, Cox Digital Solutions
- KMR MARS
- Ibope Puerto Rico (Mediafax)
- Mediamark Research National Syndicated Study
- Media Monitors Radio Spot Service
- Nielsen Media Research National – NPM, Average Commercial Ratings, MIT, NPower
- Nielsen Media Research Local – Set Meter Services, LPM Markets, Puerto Rico LPM
- Scarborough
- Simmons National Consumer Study
- TGI Puerto Rico
- Triton Digital Streaming (Ando Media)

In-Process:
- Internet Audits: Google AdPlanner, Google DART Mobile, Digital Envoy, Quova, Adap.TV, TidalTV, Videology (TidalTV), Telemetry, DoubleVerify, Vizu, AdSafe
- Arbitron PPMs – Austin, Baltimore, Charlotte, Chicago, Cleveland, Columbus, Dallas, Denver, Detroit, Greensboro, Hartford, Indianapolis, Jacksonville, Las Vegas, Los Angeles, Memphis, Miami, New York, Nashville, Norfolk, Orlando, Pittsburgh, Portland OR, Providence, Raleigh, Riverside, Sacramento, Salt Lake City, San Antonio, San Diego, San Francisco, San Jose, Seattle, Tampa, Washington DC, West Palm Beach
- BlackArrow
- comScore Media Metrix, Video Metrix, Campaign Essentials
- Mendelsohn Affluent Study
- NEC Display
- Nielsen Local TV Diary Service, STB Data
- Nielsen NetView, NetView Hybrid
- Rentrak TV Essentials
- SQAD TV Costs

Other – Ancillary Data: Claritas, Sample Frame Vendors

Summary of Television Initiatives
- Traditional Auditing:
  - UEs
  - National Orientations
  - Local Orientations
  - Enrichment of Data Beyond Households (Colleges, Group Quarters, Bars, Events)
- Emerging Perspectives – Driven by Changes in How Consumers Interact with Television:
  - Return Path Data, Hybrid/Integration Methodologies – seeking more stability and better insight into niche populations
  - Fusion
  - Online/TV Cross-Media (GRP) Orientations
  - Monetization of Authorized and Unauthorized Video

Meter Technologies Audited
Television:
- AGB Tech – TVM2 and TVM5 (Mexico)
- Nielsen International – Eurometer (Ecuador, Costa Rica, Panama, etc.)
- Nielsen Media – AP Meter and Mark II (U.S.)
- IBOPE International – DIBII and DIBIV (Brazil, etc.)
- Telecontrol – TC7 and TC8
Television/Radio/Audio:
- Arbitron – PPM (Canada, U.S.)
Internet:
- Nielsen Online – NetView (PC and Mac OS)
- comScore Media Metrix – Cproxy

Standards Setting

Industry Standards Are Critical
- MRC Mission – "Set Standards"
- Stop Relying on "Custom Criteria"
- Key Buyer Concerns: Consistency, Accuracy, Enhanced Trust
- We have always undertaken these activities, but there is substantially increased demand and need across all media types, driven primarily by digitalization.

The MRC Minimum Standards (see www.mediaratingcouncil.org for detail)
- Originally Completed in 1964
- Kept Up to Date by the MRC Board
- Components:
  - A: Ethical and Operational Standards
  - B: Disclosure Standards
  - C: Electronic Delivery/Software Standards
- New BCP Standard Rolling Out in 2012
- Key Audit Benchmark

Existing Standards
- MRC Minimum Standards, as supplemented by industry-specific guidance (NAB, ARF, IAB, ESOMAR, etc.)
- MRC Media Monitoring Standards
- IAB-Specific Guidelines (written by MRC; a simplified counting sketch follows this list):
  - Impressions: Display, Video, Rich Media, RIA, Process Addendum, AutoPlay Video
  - Clicks
  - Audience Reach
  - In-Game Advertising
  - Ad Verification Services
- MMA/IAB Mobile Web Guidelines (written by MRC)
- Other: Data Integration (MRC), Outdoor (MRC, ESOMAR, etc.)
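As a rough illustration of the census-level counting concerns these digital guidelines address (non-human traffic, internal traffic, auto-refresh), the sketch below filters a hypothetical ad-server log before tallying impressions. It is a simplified sketch only, not the guidelines' normative counting rules; the field names, robot list, and refresh threshold are assumptions made for the example.

```python
# Simplified, hypothetical illustration of census-level impression filtration.
# Field names, the robot list, and the refresh threshold are assumptions for
# this sketch; actual counting rules are defined in the published guidelines.
from collections import defaultdict

ROBOT_AGENTS = ("googlebot", "bingbot", "crawler", "spider")  # assumed robot markers
INTERNAL_IPS = ("10.", "192.168.")                            # assumed internal ranges
REFRESH_WINDOW_SECONDS = 5                                    # assumed auto-refresh threshold


def count_valid_impressions(log_records):
    """Count impressions after dropping robot, internal, and rapid auto-refresh hits.

    Each record is a dict with 'timestamp' (seconds), 'ip', 'user_agent',
    'cookie_id', and 'placement_id'.
    """
    last_seen = defaultdict(lambda: float("-inf"))  # (cookie, placement) -> last counted time
    valid = 0
    for rec in sorted(log_records, key=lambda r: r["timestamp"]):
        agent = rec["user_agent"].lower()
        if any(bot in agent for bot in ROBOT_AGENTS):
            continue  # non-human traffic
        if rec["ip"].startswith(INTERNAL_IPS):
            continue  # internal/test traffic
        key = (rec["cookie_id"], rec["placement_id"])
        if rec["timestamp"] - last_seen[key] < REFRESH_WINDOW_SECONDS:
            continue  # likely auto-refresh of the same placement
        last_seen[key] = rec["timestamp"]
        valid += 1
    return valid
```

In an audit, the focus would be on whether such filtration rules are documented, applied consistently, and disclosed to users of the data.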
Currently Under Development/Consideration
- Set-Top-Box – Partner: NCC, MVPDs
- Mobile – Partners: MMA, IAB, GSMA
- Social Media – Partners: IAB, WOMMA
- Digital Audio Streaming Applications (In-Process; nearing completion) – Partners: IAB, RAB, MMA, Others?
- Messaging (Planned)
- Data Accumulation (draft posted on www.mediaratingcouncil.org)
- Media Mix Modeling (Under Consideration) – Partners: MPA, Others?
SIGNIFICANT GROWTH IN DEMAND FOR THIS ACTIVITY

International Activity

MRC International Activity
- MRC does not accredit services outside the U.S.
- Rather, on request, MRC serves as a consultant on rating-service audits which:
  - Use the MRC Minimum Standards as the benchmark
  - Are conducted by experienced MRC CPA auditors
  - Have reasonable transparency for results – important
  - Have an appropriate local committee present

International Audits
Television:
- BBM Client Committee – Canada
- ABAP-Redes – Brazil
- International Audit Committee (Cable) – Colombia, Argentina
- Client Committee – Costa Rica
- Technical Commission – Ecuador
- CIM – Mexico
- Client Committee – Panama
- CUSEA – Peru
- Valida – Chile
Radio:
- Radio Committee – Peru
- BBM Client Committee – Canada
Print:
- CUSEMI – Peru
Internet:
- Miaozhen Systems Audit Committee – China
Note: Puerto Rico services receive full U.S. accreditation processes.

International Committee

Committee Role
- Began activities in 2011
- The International Committee (full operating group) has the following administrative functions:
  - Establish international audit priorities, including new audits
  - Assist with local-user and local-committee outreach
  - Determine the need for and priority of additional Standards
  - Develop a plan to promote international best practices
  - Meet annually in person, more often via teleconference
- International organizations and media companies can become members to allow input into Committee functions and audits

Audit-Related Functions
- Specific "Audit Subcommittees" formed for each participating international service
  - Would review new-audit scope and specifications
  - Would then review audit results in detail
- Committee assists MRC Staff and provides guidance on recommendations for the audited service
- Work with measurement services on creating audit-related action plans
- Would meet in person (and by phone) each time a relevant audit report is issued
Key Focus Areas: LATAM and Digital – much of our historical work has been in these areas

To Be Clear…
- We are NOT seeking to replace local audit committees; we want to work cooperatively with these committees
- We are NOT seeking to force U.S. policy on others; we believe we can learn from each other
- We are seeking to promote some fairly important attributes that we believe are of common benefit:

Attributes Promoted
- Transparency of Methodology with Customers
- Disclosure of Operating Performance of the Service to Customers, e.g., Response Rates, Standard Errors, Sample Distribution and Representation (a simplified reliability sketch follows this list)
- Error Disclosure Policy; Disclosure of Sampling and Non-Sampling Issues
- Sound Internal Controls, Documentation, Policies and Procedures
- Security of Panel
- Verification of Methodology, Calculations, Weighting/Projections
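To make the reliability-disclosure attributes above concrete, the sketch below computes the standard error, relative standard error, and a 95% confidence interval for a simple rating estimate. It is a simplified illustration assuming simple random sampling and an unweighted proportion; real services apply weighting and design effects, and it is their documented formulas that an audit verifies.

```python
# Simplified illustration of the reliability statistics a service might disclose.
# Assumes simple random sampling and an unweighted proportion; actual services
# apply weighting and design effects per their documented methodology.
import math


def rating_reliability(viewers_in_sample, sample_size, confidence_z=1.96):
    """Return the rating, its standard error, relative error, and a confidence interval."""
    rating = viewers_in_sample / sample_size              # estimated audience proportion
    se = math.sqrt(rating * (1 - rating) / sample_size)   # standard error of a proportion
    relative_se = se / rating if rating > 0 else float("nan")
    margin = confidence_z * se
    return {
        "rating": rating,
        "standard_error": se,
        "relative_se": relative_se,
        "ci_95": (max(0.0, rating - margin), min(1.0, rating + margin)),
    }


# Example: 120 of 1,500 sampled persons viewed the program.
print(rating_reliability(120, 1500))
# The same rating from a much smaller sample yields a wider interval.
print(rating_reliability(8, 100))
```

Response rates are disclosed in a similar spirit; one commonly used form is the number of usable, in-tab respondents divided by the originally designated sample, with the precise definition documented by the service.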
Building Process
- We are actively seeking members of the International Committee
- Who we are looking for:
  - Common Interest in Quality – Promoting Good Research, Not Business Interest
  - Annual Dues – $12,500, with a 3-Year Commitment, for a Media Entity; TBD for a Country Audit Committee
  - Strict Commitment to Non-Disclosure
  - Willingness to Actively Participate

Types of Issues Addressed

Sample of Key Issues – Traditional
- New Technology Challenges
- Sample Frame Coverage; Probability Methods, etc.
- Data Collection Quality
- Content Source Confusion
- Digital Print – Consistency of Ad Content
- Response Rates
- Sample Distribution / Sample Sizes / Reliability
- Race/Ethnic/Young Representation
- Consumer Measurement Recruitment Techniques
- Accuracy of Universe Estimate
- Data Editing Quality / Ascription
- Cross-Media Comparability

Sample of Key Issues – New Media
Ad-Centric or Site-Centric (Census):
- Users vs. Computers
- Cookie Deletion
- Sufficiency of Client-Side Counting
- Auto-Refresh, Non-Human Traffic
- Internal Traffic
- In-Session Gaps
- New Types of Usage, WAP/Mobile, etc.
- International Traffic
- Cross-Domain I-Frames
- Viewability
User-Centric (Panel):
- Panel Representation
- Meter Coverage
- Capturing All Access Locations
- Metering All Computers of Panelists
- Initial Demo Data Collection
- User In-Session Identification
- Sample Sizes, Standard Errors
- Non-Response Levels
Common and Hybrid:
- Internal Controls
- Editing, Calibration, Weighting
- Transparent Methodology

Thank You!
Follow-up contact or questions are welcome:
George Ivie
Executive Director, MRC
givie@mediaratingcouncil.org
+1 (212) 972-0300
http://www.mediaratingcouncil.org