A face-lift or reconstructive surgery? What does it take to renew a 53-year-old survey?
International Conference on Establishment Surveys, Montreal, Quebec, Canada, June 19, 2007
Jeri Mulrow, John Jankowski, Brandon Shackelford, Ray Wolfe
National Science Foundation, Division of Science Resources Statistics
www.nsf.gov/statistics/

Warning! Applied Statistician On Board

What does it take?
• Humor
• P&P: Patience & Perseverance
• Good listening skills

What does it really take?
• Separate staff
• Comprehensive, systematic approach
• Coordination & cooperation

NSF Survey of Industrial R&D: Background
• Collects information on R&D performance
• Started in 1953 and conducted annually since 1957 by the U.S. Census Bureau
• Intended to cover all for-profit, public or private, non-farm companies with 5 or more employees operating in the U.S.
• Company-based (not establishment-level)
• Response rates ~80%

Business: Then and Now
1950s
• Government largest source of R&D $$$
• Manufacturing
• Large companies dominate R&D $$$
• Business largest basic research performer
• Domestic focus
• Focus on in-firm S&T resources (central research labs)
2000s
• Business largest source of R&D $$$
• Services
• Large companies not as dominant
• Academia largest basic research performer
• Global focus
• Increased leveraging of S&T resources outside the firm

Why redesign?
Committee on National Statistics (CNSTAT) 2005 recommendation: time to redesign
• Convene panel of R&D industry experts
• Explore potential to collect information below the company level (business segment)
• Examine record-keeping practices of companies
• Increase collection efficiencies
• Revise editing system
• Improve data quality

Getting started
• A bit piecemeal
• A variety of quality improvement activities aimed at the existing survey components
• Lacked a framework for pulling the pieces together

On the right track – Separate staff
• On-going survey staff – keep the trains running on the current track
• Redesign staff – evaluate and rethink all aspects of the survey
• Matrix management

On the right track – Comprehensive, Systematic Redesign
GOAL: To provide high-quality, timely, useful information on research and development in the U.S. private sector.
1. Content
• Define data and information needs
• Identify new data sources
• Identify new content of interest
• Evaluate availability of new content
2. Collection
• Evaluate current operations and methodology
• Identify new ways of collecting data
3. Implementation
Content: Define Needs, Identify Sources & Evaluate Availability
(Diagram: sources feeding the assessment of data and information needs)
• Industry Expert Panel
• Methodological work – response studies, cognitive interviews, respondent debriefings
• Recordkeeping study
• Data user needs workshops
• BEA data gaps

Industry Expert Panel
High-level (VP of R&D or equivalent) representation from the following companies:
• A123 Technologies
• Air Products and Chemicals
• Colgate-Palmolive
• Corning
• General Motors
• Google
• Hershey Foods
• Hewlett-Packard
• Lockheed Martin
• Pfizer
• SAIC
• T/J Technologies
• Wachovia Securities

Industry Expert Panel
Held three meetings, moving from general to more specific topics:
• April 6, 2006
– Past and future changes in R&D
– Drivers of change
– How R&D is tracked and evaluated
• August 14, 2006
– Globalization of R&D
– Role of collaboration in R&D
• November 3, 2006
– R&D definitions
– R&D data needs from user workshops

Data User Workshops
• Federal data users (9/20/06) – BEA, BLS, NIST, ERS/USDA, FRB, DoE, GAO
• Nonfederal data users (10/18/06) – accounting firms, academia, think tanks, scientific press, state gov't
Four objectives:
1. Identify how and why R&D data are used
2. Identify gaps in the current data on R&D
3. Provide insight into specific topics
4. Create a draft set of data priorities

Data User Workshops – Agenda
8:45 – 9:00    Registration and coffee
9:00 – 9:30    Welcome, Introductions, Administrative Remarks; Pre-Workshop Survey Results
9:30 – 10:15   Big Picture Discussion and Overview of Goals
10:15 – 10:45  Large Group Discussions: Big Picture Questions
10:45 – 11:00  Break
11:00 – 11:30  Small Group Discussions: Data Gaps
11:30 – 12:00  Small Group Report Out
12:00 – 1:00   Lunch
1:00 – 1:30    Specific Topics Presentation
1:30 – 2:15    Small Group Discussion: Specific Topics
2:15 – 2:30    Break
2:30 – 3:00    Small Group Report Out
3:00 – 3:30    Large Group Discussion
3:30 – 4:15    Prioritization Exercise and Discussion
4:15 – 4:30    Wrap-up, Closing Comments, Next Steps

Recordkeeping Study
• Preparations (October 2004 – June 2005)
– Background interviews and project design
– Question and protocol development
• Phase 1 (July–September 2005)
– Broad understanding of firms' R&D definitions, organization, and availability of records
• Phase 2 (June–August 2006)
– Detailed review of financial recordkeeping practices
• Phase 3 (TBD)
– Recordkeeping practices on specific priority areas

Synthesis of Methodological Work
• Synthesis report of all cognitive work over the past 5 years, covering 19 different, essentially independent studies:
– Cognitive interviews
– Respondent debriefings
– Special follow-ups on specific topics
• Summarized results by survey topic:
– Findings and actions taken
– Resolved issues
– Unresolved issues
– Outstanding issues

Collection: Evaluate & Identify New Techniques
• Flowchart of entire survey process
• Frame creation and sample design
• Contact strategies (initial & nonresponse)
• Unit and item nonresponse rates
• Analysis techniques
• Edit methodology
• Imputation research (an illustrative sketch follows below)
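The collection review above flags unit and item nonresponse rates, edit methodology, and imputation research. As a minimal, hypothetical sketch (not the survey's or the Census Bureau's actual methodology), the Python snippet below shows how unit and item response rates and one simple ratio-imputation option might be computed on a toy company-level file; the column names, values, and imputation rule are all assumptions made for illustration.

# Minimal, hypothetical sketch: unit/item response rates and a simple
# ratio imputation for a toy company-level R&D file. All names and
# values here are made up for illustration.
import numpy as np
import pandas as pd

# Toy sample of companies: unit response status, an auxiliary size measure,
# and a reported item (R&D expense, $ millions) with some item nonresponse.
frame = pd.DataFrame({
    "company_id": [1, 2, 3, 4, 5, 6],
    "responded":  [True, True, True, False, True, False],
    "employment": [120, 45, 300, 80, 15, 60],
    "rd_expense": [5.2, np.nan, 12.8, np.nan, 0.4, np.nan],
})

# Unit response rate: responding companies / eligible sampled companies.
unit_rr = frame["responded"].mean()

# Item response rate for R&D expense: among respondents, share reporting it.
resp = frame[frame["responded"]]
item_rr = resp["rd_expense"].notna().mean()

# One simple imputation option: fill missing R&D expense for respondents
# using the observed R&D-per-employee ratio (illustrative only).
reported = resp[resp["rd_expense"].notna()]
ratio = reported["rd_expense"].sum() / reported["employment"].sum()
imputed = frame.copy()
needs_impute = imputed["responded"] & imputed["rd_expense"].isna()
imputed.loc[needs_impute, "rd_expense"] = ratio * imputed.loc[needs_impute, "employment"]

print(f"Unit response rate: {unit_rr:.0%}")                 # 67% in this toy example
print(f"Item response rate (R&D expense): {item_rr:.0%}")   # 75% in this toy example
print(imputed)

In practice, the actual edit and imputation methodology would come out of the joint Census–NSF investigative teams described in the slides that follow.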
Putting it all together
(Diagram: data/information needs, data sources & availability, and survey operations & methods feed new content, new methods, and new procedures, delivered through a flexible collection platform.)
• Modular approach to content
• Multiple respondents within company
• Add web-based instrument

What's next?
• Pretest content → evaluate
• Pilot content → evaluate pilot; develop procedures, edits, imputes
• Dress rehearsal (all aspects)
• Full implementation
• Continuous improvement

How do we make it all work?
• Joint Investigative Teams (Census & NSF):
– Content and Industry Coverage
– Sample Design, Classification, Methodology, and Estimation
– Cognitive Testing of Survey Materials
– Contact Strategies
– Data Collection and Capture
– Editing and Imputation
– Analytical Review
– Tabulation and Dissemination
– Continuous Improvement, QC & QA
– Integration Strategy

Challenges Ahead
With 11 separate Joint Investigative Teams:
• Staying focused and coordinated
• Keeping a systems point of view: everything is connected and is only as good as the weakest link
• Getting the right people on the right teams at the right time
• Defining and measuring success

Lessons Learned
Quotes from Thomas Watson, Sr. (IBM):
• "Knowledge creates enthusiasm."
• "We must never think that what we have today will satisfy the demand ten years from now."
• "It is better to aim at perfection and miss it than to aim at imperfection and hit it."
• "Analyze the past, consider the present and visualize the future."