Topic 2: Quality Metrics

Bart Vannieuwenhuyse
Senior Director Health Information Sciences
Janssen R&D
Brussels, 20 March 2013
Topic 2 – Quality Metrics
• Scope of the project
• Purpose – "improving"
• "what gets measured, gets done"

What are "Quality Metrics"?
• A "metric" is a measure.
• "Quality" is something a "customer" defines.
• A "Quality Metric", therefore, is a measure of quality as defined by the customer.
NOTE 1: A "customer" might be defined as anybody with an expectation of receiving something of value in exchange for something else of value, either external to or internal to an organization.
NOTE 2: Not all "Metrics" are "Quality Metrics".
http://www.capatrak.com/Files/PresH%20-%20Metrics.pdf
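As a minimal illustration of this definition (not taken from the slides), a quality metric can be represented as a named measure paired with a customer-defined target. The field names and the peer-review example below are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class QualityMetric:
    """A measure of quality as defined by the customer (illustrative sketch)."""
    name: str
    measure: Callable[[Any], float]   # how the value is computed from an artefact
    target: float                     # the customer-defined acceptance level

    def is_met(self, artefact: Any) -> bool:
        return self.measure(artefact) >= self.target

# Hypothetical example: an internal "customer" expects at least 95% of
# project deliverables to have been peer reviewed.
review_coverage = QualityMetric(
    name="peer-review coverage",
    measure=lambda records: sum(r["reviewed"] for r in records) / len(records),
    target=0.95,
)
print(review_coverage.is_met([{"reviewed": True}, {"reviewed": False}]))  # False
```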
Topic 2 – Quality Metrics
Contributing projects
Topic 2 – Quality Metrics
Convergence challenges

• Define scope – agree on areas with highest need
• "Internal" vs "External" application of metrics
• Potential opportunities to leverage (tbd)
  – Improving efficiency of collaboration in projects
  – Process to improve project deliverables
  – Measuring quality of (external) data
  – Identifying quality of (sub)contractors
Topic 2 – report back
• Quality Metrics – domains:
  – Project quality
    • Quality of deliverables – internal "peer review" generally adopted
    • Project management – "on time – on budget" generally adopted
  – Project impact
    • Uptake of solutions – need for further development of metrics (e.g. service registry using text mining in BioMedBridges)
    • Scientific impact – publications; possibility to further improve on speed and breadth of sharing results
    • Societal / health care impact – need for further development of more standardized approaches
  – Data quality …
Data Quality
“Data quality is the end product of a whole process”
[Diagram: Quality of Solution → Metrics 1; Quality of Usage → Metrics 2]
“All elements need to be of the right quality”
A Rolls Royce with 3 wheels is a crappy car
Data quality - process
• Context of data creation – meta-data
  – Should be made explicit
  – Provenance must be clear
    • "medical context" – clarity on reimbursement and "medical practice"
    • Clarity on who created the data
  – Mapping to common ontologies
• Type of use drives selection of data
  – Data should be fit for intended use – Care vs Research
  – Options to select data sources on available meta-data
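To make these process bullets concrete, here is a hedged sketch of what a machine-readable meta-data record for a data source might hold (context of creation, provenance, ontology mapping, intended use). All field names, the SNOMED mapping, and the selection rule are illustrative assumptions, not taken from the slides.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class DataSourceMetadata:
    """Illustrative meta-data needed to judge fitness for re-use."""
    source_name: str
    created_by: str                      # who created the data (provenance)
    medical_context: str                 # e.g. reimbursement rules, care practice
    ontology_mappings: dict = field(default_factory=dict)  # local code -> common ontology
    intended_use: str = "care"           # "care" vs "research"
    last_update: Optional[date] = None   # supports the timeliness dimension

def fit_for_research(meta: DataSourceMetadata) -> bool:
    # Crude, hypothetical selection rule: research re-use requires clear
    # provenance and at least one mapping to a common ontology.
    return bool(meta.created_by) and bool(meta.ontology_mappings)

example = DataSourceMetadata(
    source_name="GP registry X",                         # hypothetical source
    created_by="primary-care physicians",
    medical_context="fee-for-service billing codes",
    ontology_mappings={"local:htn": "SNOMED-CT:38341003"},  # hypothetical mapping
    intended_use="research",
    last_update=date(2013, 1, 15),
)
print(fit_for_research(example))  # True
```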
Data quality - metrics
• Quality of solution – metrics 1
– Adopt existing standards e.g. ISO 25000
    • SDLC-like approach (engineering)
    • Functional suitability
    • Performance efficiency
    • Compatibility
    • Usability
    • Reliability
    • Security
    • Maintainability
    • Portability
  – STEEEP – Safe, Timely, Efficient, Effective, Equitable, Patient-centered (IOM – US)
• Quality of usage – metrics 2
  – Effectiveness
  – Efficiency
  – User satisfaction
  – Freedom from risk
  – Context coverage
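As a rough illustration of "quality of usage – metrics 2", the sketch below derives effectiveness, efficiency, and user satisfaction from hypothetical usage-session logs; the log fields and the simple formulas are assumptions, not prescribed by ISO 25000 or by the slides.

```python
# Minimal sketch: score quality of usage from session logs.
# Field names ("completed", "duration_s", "rating") are illustrative only.
def quality_of_usage(sessions):
    completed = [s for s in sessions if s["completed"]]
    effectiveness = len(completed) / len(sessions)                     # task completion rate
    efficiency = sum(s["duration_s"] for s in completed) / max(len(completed), 1)
    satisfaction = sum(s["rating"] for s in sessions) / len(sessions)  # e.g. 1-5 survey score
    return {"effectiveness": effectiveness,
            "efficiency_s_per_task": efficiency,
            "user_satisfaction": satisfaction}

print(quality_of_usage([
    {"completed": True,  "duration_s": 40, "rating": 4},
    {"completed": False, "duration_s": 90, "rating": 2},
]))
```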
Data quality - dimensions
• Accuracy
  – Quantitative vs Qualitative data (origin of data)
  – Benchmarking to check accuracy (TransForm, OMOP, EU-ADR)
• Completeness
  – Needed granularity – data available? (TransForm selection tool)
  – "Longitudinality" – length of available history (Hx)
• Timeliness
  – Data "freshness" – latest update
• Reliability
  – Who created the data – who is responsible
  – Trustworthiness – traceability (versioning, time-stamping)
• Structured vs Unstructured
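A toy calculation of two of these dimensions (completeness and timeliness) over a hypothetical record set; the field names, the freshness window, and the scoring rules are illustrative assumptions, not taken from the slides.

```python
from datetime import date

# Hypothetical patient-level records; None marks a missing value.
RECORDS = [
    {"dob": "1972-05-01", "diagnosis_code": "I10", "updated": date(2013, 3, 1)},
    {"dob": None,         "diagnosis_code": "I10", "updated": date(2011, 6, 15)},
]

def completeness(records, fields):
    """Share of required fields that are actually filled in."""
    cells = [r[f] is not None for r in records for f in fields]
    return sum(cells) / len(cells)

def timeliness(records, as_of, max_age_days=365):
    """Share of records refreshed within the accepted 'freshness' window."""
    return sum((as_of - r["updated"]).days <= max_age_days for r in records) / len(records)

print(completeness(RECORDS, ["dob", "diagnosis_code"]))  # 0.75
print(timeliness(RECORDS, as_of=date(2013, 3, 20)))      # 0.5
```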
Next steps
• Data Quality Metrics community
  – Convene individuals from all EU projects dealing with re-use of existing data
  – Consolidate existing approaches across EU projects – share current solutions
  – Classifications of data quality metrics – check availability of ISO standards for eHealth data – if not, consider developing one? (ISO 8000 general data quality)
  – Consolidate available quality standards of solutions (e.g. ISO 25000)
  – Recommendation for projects to focus on data quality even before a project starts
  – Develop common approaches to evaluating data quality – "benchmarking" analogy of computer chips // radar-graph (see the sketch after this list)
  – Have guidelines on data quality – e.g. when creating new data / attention to meta-data (training)
  – Develop and share analytical methods that deal with "imperfect data"
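A hedged sketch of the radar-graph benchmarking idea: score each data source on the dimensions from the previous slide and overlay the profiles. The sources, their scores, and the use of matplotlib/numpy are illustrative assumptions, not part of the proposed approach itself.

```python
import matplotlib.pyplot as plt
import numpy as np

# Invented per-dimension scores (0-1) for two hypothetical data sources.
DIMENSIONS = ["Accuracy", "Completeness", "Timeliness", "Reliability"]
SOURCES = {
    "Registry A": [0.9, 0.7, 0.8, 0.6],
    "EHR extract B": [0.6, 0.9, 0.5, 0.8],
}

angles = np.linspace(0, 2 * np.pi, len(DIMENSIONS), endpoint=False).tolist()
angles += angles[:1]                      # close the polygon

ax = plt.subplot(polar=True)
for name, scores in SOURCES.items():
    values = scores + scores[:1]
    ax.plot(angles, values, label=name)
    ax.fill(angles, values, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(DIMENSIONS)
ax.set_ylim(0, 1)
ax.legend(loc="lower right")
plt.show()
```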
Data quality is a journey
And even the longest journey starts with the first step