
The NISO Altmetrics Initiative
Altmetrics14
Bloomington, IN
Nettie Lagace (@abugseye)
NISO Associate Director for Programs
What’s NISO?
• Non-profit industry trade association accredited by ANSI
with 150+ members
• Mission of developing and maintaining standards related to
information, documentation, discovery and distribution of
published materials and media
• Represent US interests to ISO TC46 (Information and
Documentation) and also serve as Secretariat for ISO
TC46/SC 9 (Identification and Description)
• Responsible for standards like ISSN, DOI, Dublin Core
metadata, DAISY digital talking books, OpenURL, SIP, NCIP,
MARC records and ISBN (indirectly)
• Volunteer-driven organization: 400+ volunteers spread
across the world
Premise of “Standards”
• Consensus standards created by a community with
various stakeholders
• Trust
• Leading to broader acceptance
• Standards as plumbing
• Standards facilitate trade, commerce and innovation
• Standards reduce costs
• Standards support better communication and
interoperability across systems
Definitions and Principles
Standard
Recommended Practice
OA
MI
• Balance: no single interest category constitutes a majority of
the membership / voting pool / working group: producer, user,
general interest
• Consensus: respond to all comments and make efforts to resolve
negative votes, even if proposal is approved
• Open process: allowing members and the community to
have confidence in NISO standards
Why worth funding?
• Scholarly assessment is critical to the overall
process
– Which projects get funded
– Who gets promoted and tenure
– Which publications are prominent
• Assessment has been based on citation since the
1960s
• Today, scholars’ multiple types of interactions
with scholarly content are not reflected
– Is “non-traditional” scholarly output important too?
Why worth funding?
• In order to move out of “pilot” and “proof-of-concept” phases …
• Altmetrics must coalesce around commonly
understood definitions, calculations and data
sharing practices
• Altmetrics must be able to be audited
• Organizations who want to apply metrics will
need to understand them and ensure consistent
application and meaning across the community
2 Phases – Phase 1
• Describe the current state of the altmetrics discussion
• Identify potential action items for further work on best
practices and standards
• Hold meetings of stakeholders to define a high-level list
of issues
– October 2013, San Francisco
– December 2013, Washington, DC
– January 2014, Philadelphia
– Public Webinars
– White paper output, public presentations, public feedback
2 Phases - Phase 2
• Phase 2: Create Working Group within NISO
structure, to create recommended practice(s)
and/or standard(s)
– Education/training efforts to ensure
implementation
• Final report to Sloan due November 2015
Steering Committee
• Euan Adie, Altmetric
• Amy Brand, Harvard University
• Mike Buschman, Plum Analytics
• Todd Carpenter, NISO
• Martin Fenner, Public Library of Science (PLoS) (Chair)
• Michael Habib, Reed Elsevier
• Gregg Gordon, Social Science Research Network (SSRN)
• William Gunn, Mendeley
• Nettie Lagace, NISO
• Jamie Liu, American Chemical Society (ACS)
• Heather Piwowar, ImpactStory
• John Sack, HighWire Press
• Peter Shepherd, Project COUNTER
• Christine Stohn, Ex Libris
• Greg Tananbaum, SPARC (Scholarly Publishing & Academic Resources Coalition)
Approach
• Open format: lightning talks, brainstorming,
breakout groups, etc.
• Include all stakeholders
• Focus on collecting unstructured input
• Make all material (including audio recordings of
steering group) publicly available
Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in
scholarly communication
• Altmetrics as part of the services of a large university library
system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata
ready?
• Snowball Metrics: Global Standards for Institutional
Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
“Lightning" by snowpeak is licensed under CC BY 2.0
Discussions
• San Francisco: Business & Use Cases; Quality & Data science;
Definitions; Development & Infrastructure
• Washington DC: Business & Use Cases; Qualitative vs. Quantitative;
Definitions/Defining Impact; Identifying Stakeholders and their Values
• Philadelphia: Use Cases (3X); Data Integrity; Definitions;
Standards; Future Proofing
Interviews
• Thirty researchers, administrators, librarians,
funders (and others)
• Semi-structured interviews
• March – April 2014
1. Develop specific definitions for alternative assessment
metrics.
2. Agree on using the term altmetrics, or on using a
different term.
3. Define subcategories for alternative assessment metrics,
as needed.
4. Identify research output types that are applicable to the
use of metrics.
5. Define appropriate metrics for specific output types.
6. Agree on main use cases for alternative assessment
metrics.
7. Develop statement about role of alternative assessment
metrics in research evaluation.
8. Identify specific scenarios for the use of altmetrics in
research evaluation (e.g., research data, social impact).
9. Promote and facilitate use of persistent identifiers.
10. Research reproducibility issues.
11. Develop strategies to improve data quality.
"Q is for Question Mark" by b4b2 is licensed under CC BY 2.0
12. Develop strategies to increase trust, e.g., openly
available data, audits, or a clearinghouse.
13. Identify best practices for grouping and aggregating
multiple data sources.
14. Identify best practices for grouping and aggregation by
journal, author, institution, and funder.
15. Define contributorship roles.
16. Establish context and normalization over time, by
discipline and country.
17. Describe main use cases for the different stakeholder
groups.
18. Identify best practices for identifying contributor
categories (e.g., scholars vs. general public).
19. Identify organizations to include in further discussions.
20. Identify existing standards to include in further
discussions.
21. Prioritize further activities.
22. Clarify researcher strategy (e.g., driven by researcher
uptake vs. mandates by funders and institutions).
"Q is for Question Mark" by b4b2 is licensed under CC BY 2.0
http://www.niso.org
Next steps
• Finalize and release white paper and draft
new work item proposal for standards/best
practices based on the study
• Proposal vetted by NISO leadership and
members
• Proposal approved and working groups
formed for Phase II of the project
Thank you!
Questions?
Nettie Lagace
nlagace@niso.org
@abugseye