What does Elsevier count?
Use Measures for Electronic Resources: Theory & Practice
ALCTS Program, ALA, Chicago
Daviess Menefee, Director, Library Relations, Americas
June 27, 2005

My Agenda for Today
- Part 1: Some history of usage reporting with ScienceDirect
- Part 2: Some of the business aspects of usage from a publisher's point of view
- Part 3: A couple of interesting results on usage and its reporting from our internal studies

Not too long ago...
ScienceDirect usage reports:
- Were Word documents
- Were derived by processing logfiles on a PC, over a single weekend
- Provided only the barest of data
- Began with six insisting customers
- Were delivered via email by the assigned Account Manager
- Never detailed what was not used

Did we know what we were doing? We thought so.
- There was some internal concern over the potential impact of usage reports on subscriptions.
- There were no standards or benchmarks to follow; we were navigating in the dark.
- We had high hopes to advance the "science" of usage reporting and analysis, but we didn't know what that was.
- Usage reporting grew organically from the demands of the market as well as the business.

And then we made some mistakes.
- Usage reporting and transactional downloads were not fully reconcilable: two different systems were in play.
- Not including zero usage caused a degree of unreliability in the reports.
- Trying to define sessions in a session-less state was not very productive.

But then we did some good things.
- Published a white paper indicating that we were reducing the number of downloads based on user behavior. (Our sales staff were not pleased.)
- Invested heavily in designing and implementing a state-of-the-art system that could provide reports directly to the customer.
- Supported the establishment of COUNTER.

Did we learn anything? Yes, about customer reporting.
- We have improved as an industry in defining and delivering reports; we have come a long way.
- We have created a trusted third-party group to monitor and audit the reports.
- We continue to study user behavior and try to understand it better.

Part 2: Why does Elsevier count usage?
- Data for business information, such as trends, product performance, and return on investment
- Need data for informed decisions
- Determine future directions, e.g. pricing

Elsevier Management Reports
- Produced monthly
- Summarize Key Performance Indicators
- Indicator areas: Content, System, Customers, Other (links, trials, Web Editions, etc.)

Performance Indicators for ScienceDirect
Content indicators:
- Number of journals
- Number of abstracts
- Number of full-text articles available
System performance:
- Number of page requests
- Total full-text articles downloaded, PDF/HTML
- Total articles, incl. SD On Site
- Total searches

Performance Indicators, continued
Customers:
- Number of contracts
- Number of registered accounts
- Number of active accounts
- Estimated number of user sessions
- Number of active users (cookie-based)

More on Performance Indicators
"Other areas" measured:
- Trial customers
- Guest usage, esp. article downloads (pay-per-view)
- Web Editions (limited to customer-base statistics)
- Promotional usage
- Scirus (number of searches and indexed pages)
- Linking indicators

What is the point?
- The company has set target numbers for most areas of the KPIs.
- We are changing our thinking from traditional publishing to how to grow an electronic journals-and-books business.
- KPIs enable the setting of objectives and priorities.
- Publishing units now have usage goals.

Product Management Reports
An opportunity for product managers to review and comment on trends, or to explain why a number is out of proportion. Examples:
- Usage of abstracts decreased during the month, but the number of guest users increased.
- MathML usage increased this month over last month and points to a trend of continuing growth.
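The monthly KPI summary described above amounts to a simple aggregation over individual usage events. A minimal sketch of that idea follows; the event fields (`type`, `format`, `account`) and the report shape are hypothetical illustrations, not Elsevier's actual schema or system.

```python
from collections import Counter

def summarize_kpis(events):
    """Aggregate hypothetical usage events into a monthly KPI summary.

    Each event is a dict such as:
      {"type": "fulltext", "format": "PDF", "account": "univ-a"}
      {"type": "search", "account": "univ-b"}
    """
    # Count full-text downloads, broken out by delivery format (PDF/HTML).
    downloads_by_format = Counter(
        e["format"] for e in events if e["type"] == "fulltext"
    )
    return {
        "fulltext_total": sum(downloads_by_format.values()),
        "fulltext_by_format": dict(downloads_by_format),
        "searches": sum(1 for e in events if e["type"] == "search"),
        # Distinct accounts that generated any activity this month.
        "active_accounts": len({e["account"] for e in events}),
    }

events = [
    {"type": "fulltext", "format": "PDF", "account": "univ-a"},
    {"type": "fulltext", "format": "HTML", "account": "univ-a"},
    {"type": "search", "account": "univ-b"},
    {"type": "fulltext", "format": "PDF", "account": "univ-b"},
]
print(summarize_kpis(events))
```

In a real reporting pipeline the same tallies would be computed per customer and per journal rather than globally, which is essentially what the COUNTER report definitions standardize.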
Data is converted to graphs
[Chart: ScienceDirect usage, one month, by page type: home page; other journal/book/MRW lists; journal/book homepage visits; subject corners; gateway entries; search activity; journal issues/TOC pages; search result lists; database abstracts; articles]

What's worth counting?
- Just about everything that involves end users and content.
- Full-text articles are the norm, but Elsevier also continues to monitor browsing behavior, especially from guest users (a possible new market).
- It is important to look at changes and how they affect usage (training, system changes, etc.).

Part 3: Some Internal Studies
1. Referring URL Study
- Who is sending us the traffic?
- What are the subject areas where users come into the product?
2. Usage on Usage Study
- Are these reports really being used?
- Who is using them?
- What triggers use?

Traffic Referrals to ScienceDirect
Major sites (95% of referrals):
- Customers' OPACs
- PubMed
- Elsevier site (Cell Press)
- CrossRef
- ChemPort
- Search engines (Scirus)

Referrals as a Chart
[Chart: referral share by source: customer OPACs, PubMed, Elsevier, CrossRef, ChemPort, search engines]

SD Entry by Subject Areas
[Chart: for each subject area (Economics, Engineering, Chemical Engineering, Business & Management, Materials Science, Chemistry, Decision Sciences, Energy, Earth Sciences, Arts & Humanities, Physics, Computer Science, Environmental Science, Social Sciences, Agricultural Science, Mathematics, Medicine, Biochemistry, Pharmacology, Immunology, Neuroscience), the percentage of entries via Main Home Page, Journal Home Page, and External Link]

Some Analysis on Subject Entry
- Life science end users prefer to come into ScienceDirect from third-party sources, namely the A&I databases.
- Humanities and social science users prefer the journal home page on ScienceDirect.
- For some areas there appears to be no difference: energy, chemical engineering, materials science, and engineering.

2. Usage of Usage Reports
An internal Elsevier study of the use of ScienceDirect usage reports.
Who uses usage reports?
- Asia/Pacific librarians lead the list.
What triggers use of these reports?
- Email alerts; Asia/Pacific has the most alerts set up.
- 63% of customers use the reports when they have an alert, as opposed to 30% without an alert.

Effect of Alerts
[Chart: by region (APAC, Americas, EMEA, Total): percentage of active accounts using the reports, percentage of accounts with alerts, and report use among accounts with vs. without alerts]

Popular Usage Reports
[Chart: relative popularity of the reports over time: 1a. FTA per entitlement; 1b. FTA per journal (COUNTER); 2a. General overview; 2b. Users and sessions; 3a. Usage by file type; 3b. Usage by entitlement; 3d. Document usage stats; 4a. Searches and sessions (COUNTER)]

Number of Reports over Time
[Chart: Feb 2003 through Jan 2005: reports per ID and number of active IDs]

Usage Reports: did we overbuild?
- ScienceDirect data may indicate so.
- But the data may be useful some day, and it is probably best to keep it at hand for now.
- Are there more functional reports that should be developed, e.g. factoring the cost of content into a performance measure?

To really end this presentation
- We still do not know enough about user behavior and how it affects the numbers.
- We do know that users are disparate and have different usage patterns in their respective subjects.
- Open question: what are meaningful numbers, and in what context? The answer may be a local solution.
- Publishers don't have all the answers either.

Thank You!
d.menefee@elsevier.com