PS Benchmark Review: SAMPLE
Updated July 2011
Bo Di Muccio, Ph.D.
Vice President, Professional Services
Benchmark Highlights for SAMPLE
 Overall, this exercise reveals that SAMPLE's processes and performance levels are more on target than off target
 Based on a core peer group comparison against other large, product-centric PS organizations ("Product Providers"), SAMPLE looks largely as one would expect
 Benchmark results are especially on target for Services Engineering and Partner Management
 Sales, Delivery, and Operations functions all compare favorably to current industry and peer group benchmark standards
 The main area of concern is financial performance, which benchmarks rather poorly even against very product- and partner-centric PS businesses
 Detailed findings and recommendations attempt to zero in on high-impact ways to align better to industry and peer group benchmarks
PS Benchmark Dashboard: SAMPLE
Reminder: This is a benchmark comparison, not a judgment about whether you're "good or bad" at these things or whether your practices are "mature." It's simply a comparison of your practices and metrics to industry and peer group benchmarks using the logic and methodology previously articulated.
TSIA and Benchmark Methodology Overview
Technology Services Industry Association: Why, How, What?
Why: Product + Services = Maximum Economics
How: Community Model
What: Data, Benchmarking, Networking, Awards and Recognition, Advisory
"The most interesting economics in technology are now being generated by the right combination of products and services."
Professional Services
TSIA Community
Sample TSIA PS Members
• 300+ TSIA Members
• 105 PS Members as of 7/1/2011
• 40% growth in 3 years
• This is our benchmarking community
2011 PS Research Activities
Diagram: 2011 PS research activities. Service line benchmarking studies (Service 50, Cloud 20, Europe 20) covering public financial data, rates and comp, project performance, solution centers, and partner management; topical studies (Cloud and PS, partner enablement). Together these form the basis for benchmark reviews, case studies, and peer networking.
Core PS Benchmark Study Overview
 Perpetual study since 2005
 Hundreds of PSOs benchmarked
 100+ validated completes in most recent snapshot
 Ability to segment in multiple ways: company/TPSO size; HW versus SW; traditional SW vs. SaaS model; services intensiveness; PS charter; PS performance
 Basis for: data mining, advisory/inquiry, PS benchmark reviews
Benchmark Review Logic
 This is a benchmark review, NOT a "maturity assessment"
 No theoretical or arbitrary standards; rather, how you look against others
 Entries carefully validated to ensure benchmark-quality data
 SAMPLE benchmark compared to 2 core groups:
   "Industry"
    100+ companies benchmarked and captured in Q410 snapshot
    Cross section of larger/smaller (50/50) and software/hardware (65/35)
    Virtually all technology product companies with services in the portfolio
   "Peer Group": Product Providers
    60+ PSOs in this group
    Revenue mix and PS size make it the best current-state comparison
    Peer group is more heavily weighted in "scoring"
 Scoring is meant to highlight areas for focus or possible initiatives
 100% confidential in every way
Service Strategy Profiles

Service Strategy Type | Role of PS | Avg PS Revenue Contribution
Solution Provider | Solution completion: without PS capability, immature product capability never matures | 50%
Product Extender | Account extension: PS extends service footprint into existing install base to secure product renewal | 20%
Product Provider | Product enablement: PS accelerates adoption of maturing product; product/customer success critical | 6%
Key Goals of Service Strategy Profile Approach
Help PSOs Understand True Priorities
Help Align Executive Management Team
 Charter of PS
 Financial Business Model
 Growth Targets
 Type of Service Offerings
 Service Sales Strategy
 Core PS Skills
 Service Partner Strategy
 Scalability Model
Benchmarking Practices: Comparison 1
Diagram: "In alignment?" Your practice is compared against the required practice, the industry majority practice, and the peer group majority practice.
Comparison 2: Practice "Zones"
Diagram: practices fall into zones ranging from the required practice, to common industry practice, to common peer or pace-setter practice, to differentiated practice (an uncommon good or pace-setter practice).
Practices are business processes that help companies achieve target results.
Metrics/Results Zones
Metrics are analytical measurements intended to quantify the state of a business. They are independent variables or leading indicators for the performance of a PS business. Examples are attach rates, billable utilization, and project duration.
Results are analytical measurements intended to quantify the state of a business. These are dependent variables or lagging indicators for the performance of a PS business. Results are produced by practices and metrics. Examples are project margin, field margin, and net operating income.
Zones: values within +/- 10% of the average are "on target"; otherwise they are off target (high) or off target (low). (A classification sketch follows the exceptions below.)
Important exceptions:
• Results or metrics that are off target relative to both the industry and peers, but in a clearly positive way, are rated as "differentiated" (e.g., extremely high project margins)
• Key metrics or results that are off target relative to both the industry and peers, and in a clearly negative way, are rated as "critical off target" (e.g., extremely low project margins)
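To make the zone logic concrete, here is a minimal sketch in Python. The +/- 10% band and the two exceptions come from the slide above; the function names, the higher_is_better flag, and the tie-breaking rule are illustrative assumptions, not TSIA's actual implementation.

```python
def zone(value: float, avg: float) -> str:
    """Place a metric or result relative to one benchmark average."""
    if abs(value - avg) <= 0.10 * abs(avg):  # within +/- 10% of the average
        return "on target"
    return "off target (high)" if value > avg else "off target (low)"

def classify(value: float, industry_avg: float, peer_avg: float,
             higher_is_better: bool = True) -> str:
    """Combine the industry and peer zones, applying the two exceptions."""
    ind, peer = zone(value, industry_avg), zone(value, peer_avg)
    if ind != "on target" and peer != "on target":
        # Off target against BOTH groups: direction decides the rating.
        high_vs_both = value > industry_avg and value > peer_avg
        if high_vs_both == higher_is_better:
            return "differentiated"       # e.g., extremely high project margins
        return "critical off target"      # e.g., extremely low project margins
    # Otherwise report the peer zone, since the peer group is weighted
    # more heavily in scoring (see Benchmark Review Logic above).
    return peer
```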
Evaluation Framework
 Each self-reported practice was compared to the majority practices of the industry and the target peer group
 On target industry practices were worth 3 points; on target peer group practices were worth 5 points
 Each self-reported metric and result was compared to the average metrics and results of the industry and the target peer group
 On target or better industry metrics and results were worth 3 points; on target peer metrics and results were worth 5 points
 Differentiated practices/metrics: points taken off the table
 Missing or off target critical practices/metrics: possible points doubled
 Scores are expressed as a percentage of total possible points and assigned color coding as follows (a scoring sketch follows this list):
  0% - 24%: RED
  25% - 49%: ORANGE
  50% - 74%: OLIVE
  75% - 100%: GREEN
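A minimal sketch of this scoring scheme in Python. The 3/5 point values and color bands are from the slides; the function names and input format are illustrative assumptions, not TSIA's actual implementation.

```python
INDUSTRY_PTS, PEER_PTS = 3, 5   # on-target points vs. industry / peer group

def score(items):
    """items: dicts with boolean keys on_target_industry and on_target_peer,
    plus optional flags differentiated and critical."""
    earned = possible = 0
    for it in items:
        if it.get("differentiated"):
            continue                       # points taken off the table entirely
        item_earned = (INDUSTRY_PTS if it["on_target_industry"] else 0) \
                    + (PEER_PTS if it["on_target_peer"] else 0)
        item_possible = INDUSTRY_PTS + PEER_PTS
        if it.get("critical") and item_earned == 0:
            item_possible *= 2             # critical miss: possible points doubled
        earned += item_earned
        possible += item_possible
    return 100.0 * earned / possible if possible else 0.0

def color(pct):
    """Map a percentage score to the color bands above."""
    if pct < 25: return "RED"
    if pct < 50: return "ORANGE"
    if pct < 75: return "OLIVE"
    return "GREEN"
```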
Putting It All Together: An Example
• Practice or metric missing or off target: possible points unchanged
• CRITICAL practice or metric missing or off target: possible points doubled
• Differentiated metric or practice: points taken off the table
• Overall rating: points earned divided by total possible points
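Continuing the sketch above, a hypothetical worked example (all numbers invented for illustration):

```python
items = [
    {"on_target_industry": True,  "on_target_peer": True},    # earns 8 of 8
    {"on_target_industry": False, "on_target_peer": False,
     "critical": True},                                       # 0 of 16 (doubled)
    {"on_target_industry": False, "on_target_peer": False,
     "differentiated": True},                                 # off the table
    {"on_target_industry": False, "on_target_peer": True},    # earns 5 of 8
]
pct = score(items)        # (8 + 0 + 5) / (8 + 16 + 8) = 13/32 = 40.625
print(pct, color(pct))    # -> 40.625 ORANGE
```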
Dashboard categories: PS Engine, Customer Experience, Product Impact, Partners & Sourcing
PS Benchmark Dashboard: SAMPLE
TPS Business Model
Business Model: PS Pace Setters
Observation: The top 20% of PS organizations are very profitable, but the bottom 80% are just north of break-even. Yet the revenue mix for these two groups is not very different. For the vast majority of PSOs, this data explodes the 20% Net OI "myth."
Source: TSIA PS Benchmark Data, Q1 2011
PS Business Model
Observations: Pretty close fit for a classic Product Extender. Very heavy emphasis on product over service in the revenue mix. PS revenue contribution precisely on the peer group average.
PS Business Model
Observation: The dashboard shows that SAMPLE is truly a classic Product Provider in virtually every way. The exception is financial performance: even Product Providers do better on project margin. Field costs are about average, so the pressure on field margin comes from low project margins.
PS Business Model
Observation: Total below-the-line OpEx is pretty much on target. The one potential miss: SAMPLE is spending 2X the average on sales. G&A spending is half the industry average. Is this high efficiency or a lack of critical investment?
PS Business Model
Observation: The combination of low project margins and moderately high costs means that SAMPLE's business model looks very different from the industry and from the Product Provider peer group. The cost center model was more common 10 years ago; it's actually very uncommon now. That doesn't mean it isn't the right model for you.
PS Business Model
Observation: Same data, different view. This view highlights the consequences of lower project margins coupled with higher overall costs.
Overall Observations and Recommendations
PS Benchmark Dashboard: SAMPLE
Delivery
Delivery in the green, but not by much:
• Delivery staff spending a lot of time on presales, which likely contributes to higher sales costs
• Revenue metrics are off the charts high, so that's not the problem
• However, rate realization and discount rates (from the Sales module) are huge misses
• What's puzzling is that the actual rates are squarely in range, indicating the rate card is skewed
• Most companies have far higher involvement of direct FTE project managers
Business Model
The other of only two rating areas NOT in the green:
• If this business model is documented in the strategy and represents a deliberate decision to run PS as a cost center, this has been achieved
• The message is that even Product Providers have moved to a profit center model for PS, including ones with a heavily partner-centric model
• A 5% to 10% increase in project margins would move SAMPLE squarely into the expected profile for Product Providers
• This will be difficult as long as PS reports to Sales, which is likely concerned only with product margins
• Question: Do you have a "seat at the table" with this economic model in place?
• Would you have a better seat if you could demonstrate good "services hygiene"?
TPSO Ability to Change
Framework for Prioritizing Initiatives
Diagram: a 2x2 matrix plotting initiatives by positive impact on PS financial performance (lower vs. higher impact) against TPSO ability to change (easier vs. harder to do), as sketched below.
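A trivial illustration of the prioritization grid in Python. The 0-1 scores and the threshold are invented assumptions, not part of the TSIA framework:

```python
def quadrant(impact: float, ease: float, threshold: float = 0.5) -> str:
    """Bucket an initiative into the 2x2 grid from 0-1 impact/ease scores."""
    col = "Higher Impact" if impact >= threshold else "Lower Impact"
    row = "Easier to Do" if ease >= threshold else "Harder to Do"
    return f"{col} / {row}"

# e.g., quadrant(0.8, 0.7) -> "Higher Impact / Easier to Do" (do these first)
```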
TPSO Ability to Change
Priority Initiatives for SAMPLE
Initiatives plotted by positive impact on PS financial performance and ability to change:
• Consider Formal Project C-Sat Program
• Consider Implementing PMO Practice
• Validate Product Provider Model as Foundation of Service Strategy
• Review Rate Card and Discounting Policy
• Review Project Margin Performance
• Solution Center
• Overall Pyramid/Labor Costs
• Review and Validate Sales Model
Appendix: Additional Function Detail
Total Score
Practice Zones: SAMPLE
Chart: practices plotted across peer, industry, and reasonable practice zones.
Missing/Off Target Practices:
• PSE reports to Sales Exec
• PSO w/o final pricing authority
• Sales not comp'd on PS bookings
• No project C-Sat program
• Partners have primary customer rel.
• Resourcing done locally only
Differentiated Practices:
• Documented PS strategy
• Formal sales methodology
• Formal skills assessment
• Defined career paths
• Services Engineering releases new PS offers
• Formal partner certification
Good balance of missing/off target and differentiated practices.
Key Metrics and Results: SAMPLE

Below Target
• Project margin
• Field gross margin
• Net operating income
• Attach rate
• PS-product revenue ratio
• Proposal hit ratio
• Days quote to closure
• % projects with C-Sat survey
• Rate realization
• % delivery corp PS direct

Severely Lagging Results
• % delivery remote direct
• Avg project duration
• % projects with PM
• Avg days onboard to billable
• Avg days to source new PS
• Engineering target utilization
• Mktg spend: awareness
• Mktg spend: demand gen

On Target
• PS revenue contribution
• PS EPS contribution
• Field costs
• PS 1 yr growth
• Project size
• Target utilization
• Actual utilization
• Realized rate ($)
• Voluntary attrition
• % projects with post-mortem
• G&A costs
• % delivery time on training
• New PS concept to delivery
• Marketing costs
• Margin on sub resources
• Total OpEx
• Total costs
• PS 3 yr growth

Lagging Results
• % delivery local direct
• % delivery global partners

Above Target
• Sales costs
• Avg rate discount
• % engagements Fixed/NTE
• Delivery % time on presales
• Revenue per consultant
• Revenue per total PS HC
• % delivery subcontracted
• % delivery local partners
• Engineering costs
• % offers repeatable packages
• Mktg spend: content
• Mktg spend: analysis

Relative balance of below target, on target, and above target metrics and results.
Overall Result vs. Recent Reviews
Chart: SAMPLE's practices and results scores, vs. peers and vs. industry, alongside other recent benchmark reviews.
Practices and Results on Target by Functional Area: Business Model
Practices and Results on Target by Functional Area: Sales and CRM
Practices and Results on Target by Functional Area: Delivery
Practices and Results on Target by Functional Area: Operations
Practices and Results on Target by Functional Area: Engineering
Practices and Results on Target by Functional Area: Marketing
Practices and Results on Target by Functional Area: Partner Management
Questions?
Bo Di Muccio, Ph.D.
Vice President, Professional Services
Technology Services Industry Association
Office: 724-946-9798
Mobile: 724-877-0062
E-mail: bo.dimuccio@tsia.com
Blog: DiMuccio's DataViews Blog
Website: www.tsia.com