T-A-1100-1200 Monitoring and Reporting

Monitoring and Reporting Performance Metrics
Improvement Path Systems – Joe Zuchora
Context, Purpose, Outcome
1) Context: Navy Medicine can benefit from monitoring and reporting performance metrics to help drive better decisions at all management levels.
2) Purpose: Provide guidance on designing and creating dashboards, and discuss examples of successful dashboards used for performance monitoring.
3) Outcome: Identification of the need for dashboards and a methodology for creating and using them.
Agenda
• What is a Dashboard?
• Why Dashboards?
• Monitoring of performance
• Dashboard design
• Metric selection and creation
• Display of metrics
• Setting targets
• Management decisions
• Case study
What is a Dashboard?
“An easy to read, often single page, real-time user interface, showing a graphical presentation of the current status and historical trends of an organization’s Key Performance Indicators”
– Peter McFadden, CEO, Excel Dashboard Widgets
Why Dashboards?
• Data Rich
• Data Consolidation
• Information Application
  • Long term: Monitoring Performance (Pharmacy Historical Queue Time)
  • Short term: Display actionable, concise information (Pharmacy Current Wait Time)
• Benefits: Automated, real-time information, available to all stakeholders (see the sketch below)
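To make the long-term vs. short-term distinction above concrete, here is a minimal sketch that derives both kinds of pharmacy metrics from one set of timestamped records. The DataFrame and its column names (`checkin_time`, `pickup_time`) are hypothetical assumptions, not data from the slides.

```python
import pandas as pd

# Hypothetical pharmacy transaction log: one row per prescription.
events = pd.DataFrame({
    "checkin_time": pd.to_datetime([
        "2015-03-02 09:05", "2015-03-02 09:20", "2015-04-06 10:10", "2015-04-06 10:30",
    ]),
    "pickup_time": pd.to_datetime([
        "2015-03-02 09:27", "2015-03-02 09:48", "2015-04-06 10:26", "2015-04-06 10:58",
    ]),
})

# Queue time in minutes for each prescription.
events["queue_min"] = (events["pickup_time"] - events["checkin_time"]).dt.total_seconds() / 60

# Long term: historical queue time, averaged by month, for performance monitoring.
historical = events.set_index("checkin_time")["queue_min"].resample("MS").mean()

# Short term: current wait time, e.g., the mean over the most recent prescriptions.
current = events.sort_values("pickup_time").tail(2)["queue_min"].mean()

print(historical)
print(f"Current wait time: {current:.0f} min")
```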
Performance Monitoring
• Need for monitoring
  • Progress
    • Goal Definition
    • Update
  • Thresholds
  • Sustainment
• Performance visibility
  • Clear/Simple
  • Audience
Designing a Dashboard
• Key Questions before you get started:
  • Who is the audience?
    • Role
    • Workflow
    • Data skills
  • What value will the dashboard add?
    • Dashboards can provide value in a lot of different ways
  • What type of dashboard do we need?
    • Strategic? Operations? Historical? Real-time? High level? Drill-able?
Be aware of the consumers of the information.
Designing a Dashboard
• Each page or grouping of visualizations needs a goal
• Ask good questions
  • Resist the urge to add every metric available
  • “What if I told you…”
• Reporting vs. Exploring
  • If the metric has not been measured or produced before, it does not belong on a dashboard
Metrics need to be well understood and designed for the end user.
Metric Selection
• Key performance indicators (KPIs) allow for ongoing measurement of a system’s performance
• KPIs should be:
  • Simple
  • Understood
  • Actionable
  • Credible
The perfect metric is actionable, understood, credible, transparent, and simple to calculate (a minimal example follows).
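As one hedged illustration of a metric that is simple and transparent to calculate, the sketch below defines a KPI as the share of prescriptions filled within a 20-minute target (the threshold echoes the queue-time target used later in this deck). The function name and sample data are assumptions for illustration only.

```python
import pandas as pd

def pct_within_target(queue_minutes: pd.Series, target_min: float = 20.0) -> float:
    """KPI: share of prescriptions filled within the target queue time.

    Simple (one comparison), understood (a percentage), and actionable
    (staff can see how far current performance is from the target).
    """
    return float((queue_minutes <= target_min).mean() * 100)

# Hypothetical sample of queue times, in minutes.
sample = pd.Series([12, 18, 25, 9, 31, 16, 22, 14])
print(f"{pct_within_target(sample):.1f}% of prescriptions filled within 20 minutes")
```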
Metric Creation
• Requirements and design
• Identify data needs
  • Feasibility?
  • The technology of choice will dictate the visualization tools available
• Prototype metrics with the team
• Deploy, monitor, and make revisions
• Documentation! (see the sketch below)
• Pitfalls: Data source changes, business process changes, too many metrics, reporting vs. exploring
The metric creation process is iterative in nature and involves the developers, project managers, and end users.
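One lightweight way to act on the “Documentation!” point above is to keep each metric’s definition as a small structured record that can be revised as the data source or business process changes. Everything below (field names, the example metric) is an illustrative assumption rather than anything prescribed in the slides.

```python
from dataclasses import dataclass, field

@dataclass
class MetricDefinition:
    """Documentation record for one dashboard metric."""
    name: str
    owner: str                  # who is accountable for the metric
    data_source: str            # where the raw data comes from
    calculation: str            # plain-language formula
    target: str                 # what "good" looks like
    revision_notes: list[str] = field(default_factory=list)  # iterative changes

queue_time = MetricDefinition(
    name="Pharmacy queue time",
    owner="Pharmacy department head",
    data_source="Prescription check-in / pick-up timestamps",
    calculation="pickup_time - checkin_time, averaged per day",
    target="20 minutes or less",
)

# Revisions accumulate as the metric is deployed, monitored, and changed.
queue_time.revision_notes.append("2015-06: excluded mail-order prescriptions")
print(queue_time)
```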
Display of Metrics
• Types of charts
  • Trends
  • Pareto
• Stoplights
• Tables
• Thermometers
• Heat maps
• Conditional Formatting
Understanding Displays
• Trend charts (example sketch below)
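The original slide shows a trend-chart image that does not survive in this text export. As a stand-in, here is a minimal matplotlib sketch of a monthly queue-time trend plotted against a target line; the monthly numbers are invented for illustration.

```python
import matplotlib.pyplot as plt

# Hypothetical monthly average pharmacy queue times, in minutes.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
avg_queue_min = [27, 25, 24, 22, 21, 19]

fig, ax = plt.subplots()
ax.plot(months, avg_queue_min, marker="o", label="Avg queue time")
ax.axhline(20, color="red", linestyle="--", label="Target (20 min)")
ax.set_xlabel("Month")
ax.set_ylabel("Minutes")
ax.set_title("Pharmacy queue time trend")
ax.legend()
plt.show()
```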
Understanding Displays
• Pareto charts
• Pareto with drill down
“What is the most common reason for last minute cancellations? How can I find out more information?”
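The Pareto image from this slide is not reproduced here. The sketch below builds a basic Pareto chart (bars sorted by count plus a cumulative-percentage line) for hypothetical cancellation reasons, which is one common way to answer the question quoted above; the categories and counts are made up.

```python
import matplotlib.pyplot as plt

# Hypothetical counts of last-minute cancellation reasons, sorted descending.
reasons = {"No transportation": 42, "Felt better": 25, "Forgot": 18,
           "Scheduling conflict": 10, "Other": 5}
labels = list(reasons)
counts = list(reasons.values())
cum_pct = [sum(counts[: i + 1]) / sum(counts) * 100 for i in range(len(counts))]

fig, ax = plt.subplots()
ax.bar(labels, counts)                               # frequency of each reason
ax2 = ax.twinx()
ax2.plot(labels, cum_pct, color="red", marker="o")   # cumulative percentage
ax2.set_ylim(0, 110)
ax.set_ylabel("Count")
ax2.set_ylabel("Cumulative %")
ax.set_title("Last-minute cancellations by reason")
ax.tick_params(axis="x", rotation=30)
plt.show()
```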
Understanding Displays
• Stoplights, Tables, Maps, and Conditional Formatting
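The example images from this slide are missing in the text export. As a small stand-in, here is one way a stoplight/conditional-formatting rule might be expressed in code; the thresholds and clinic names are assumed for illustration.

```python
def stoplight(queue_min: float, green_max: float = 20, amber_max: float = 30) -> str:
    """Map a queue time to a stoplight color using hypothetical thresholds."""
    if queue_min <= green_max:
        return "green"    # meeting the target
    if queue_min <= amber_max:
        return "amber"    # at risk, watch closely
    return "red"          # action required

for clinic, minutes in {"Clinic A": 14, "Clinic B": 26, "Clinic C": 41}.items():
    print(f"{clinic}: {minutes} min -> {stoplight(minutes)}")
```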
Setting Targets
• A target should be achievable
  • Moderate
  • Stretch
• A target should be intuitive
  • Zero sentinel events
Monitoring Using Targets
• More than just meeting the target
  • Variability
  • Comparisons over time
Target: 20-minute queue time or less (see the sketch below).
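To make “more than just meeting the target” concrete, the sketch below checks variability and a month-over-month comparison against the 20-minute queue-time target, not just the average; the daily numbers are invented for illustration.

```python
import statistics

target_min = 20
# Hypothetical daily average queue times (minutes) for two consecutive months.
may = [18, 22, 19, 25, 17, 21, 16, 24, 19, 20]
june = [19, 18, 20, 17, 21, 18, 19, 20, 18, 17]

for name, month in (("May", may), ("June", june)):
    mean = statistics.mean(month)
    spread = statistics.stdev(month)          # variability, not just the average
    hit_rate = sum(d <= target_min for d in month) / len(month) * 100
    print(f"{name}: mean {mean:.1f} min, std dev {spread:.1f}, "
          f"{hit_rate:.0f}% of days at or under {target_min} min")
```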
Management Decisions
• Management via dashboard is enabled by:
  • Tool visibility at all appropriate levels
  • Regular review of dashboard metrics
  • Ownership of metrics
  • Identification of all metric contributors
  • Focus on processes and not “gaming” the metric
Management must own the metrics – it is a top-down process.
Case Study
Data-rich example of using tables to display information.
Audience: Technical and Analytical End User
Value: Provide a forecasted workload
Type: Specific, tactical, forecasting
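The slides do not say how the forecasted workload in this case study was produced. Purely as an illustration of the idea, here is a naive moving-average workload forecast on made-up weekly counts; it is not the method used in the actual dashboard.

```python
# Naive illustration only: the deck does not describe the actual forecasting
# method used in this case study.
weekly_workload = [310, 295, 330, 342, 318, 355, 361, 340]  # hypothetical counts

window = 4
forecast = sum(weekly_workload[-window:]) / window  # simple moving average
print(f"Forecasted workload for next week: {forecast:.0f} items")
```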
Case Study
Data consolidation example of using different types of displays.
Audience: Strategic, Director/CO Level
Value: Performance monitoring, create action items
Type: Specific, operational, historical and current
Case Study
Data consolidation example of using different types of displays.
Audience: Tactical, Manager/Front line supervisor
Value: Performance monitoring, action items for today
Type: Specific, tactical, current
[Dashboard excerpt: one row per technician, Tech Name 1 through Tech Name 6, …]
Questions
Joe Zuchora, Analyst
Improvement Path Systems
zuchora@improvmentpath.com
248-935-6364