Innovative Capability Audits of University Research Centers
Halvard E. Nystrom
Engineering Management
University of Missouri-Rolla
Abstract: An assessment of current conditions is the first logical step in the development of an
effective technology management program. If an organization is to prioritize the resources it invests
in the areas of development that are most critical, it should first assess its current condition with
some form of technology audit. However, the literature offers little practical research on methods
for applying these audits in research organizations. This paper reviews the findings in the
literature on technology audits and other assessment tools as a first step in the development of a
technology management program for a university research center. It also describes the method that
was developed, the motivations behind its design, and the results obtained in a case study. The
methodology is presented as a practical approach to generating this assessment.
Keywords: Innovative capability audit, university research center, technology audit, innovation.
Introduction
Consider an enviable situation for a research professional involved in technology management: a
new research center is created on your campus, and the new center director agrees that technology
management is critical to success and asks for help. What activities should be performed in order
to provide technological leadership and enable the center to become more successful? There are
many things to do, but which should be done first? This article relates the actual events that occurred
at the Center for Infrastructure Engineering Studies (CIES) at the University of Missouri-Rolla
and documents the efforts to answer these questions. The Center is chartered to foster research in
the application of advanced composites to structures such as bridges and buildings.
Klein (1995) identified four key elements in the management of intellectual capital that
provide some direction: 1) understanding the strategic and operational roles intellectual capital
plays in the organization, both today and tomorrow; 2) creating an infrastructure for cultivating
and sharing it; 3) creating a culture that encourages it; and 4) monitoring, valuing and reporting it.
The question is how to do this and where to start. Effective assessment of the organization's current
condition advances all four of these elements.
Literature Search
Searching the literature for usable methods to assess a research center was disappointing.
In the 1960s there was considerable research on technology audits, but it focused on
macroeconomic analysis and examined total markets rather than individual organizations.
Numerous consultants currently provide technology assessment services and perform internal
research program audits (Oxford Innovation 1998, FRD 1998, Technology Transfer Group 1998,
Forbairt 1998, Forthright Innovation 1998) and have documented their efforts with major
corporations such as Daimler-Benz (Blau 1995), 3M (Blau 1996), Eaton (Jaskolski 1996), Nalco
Chemical (Keiser & Blake 1996), and Westinghouse (Foster 1996). Some of the articles
explained processes that required significant historical information (Roberts 1998; Tipping,
Zeffren & Fusfeld 1995). However, these articles provided little guidance on how to actually perform such an audit.
Assessing the center's technologies to enable technology forecasts and technology
roadmaps originally seemed an attractive approach. However, as methods were investigated,
it became clear that effective measurements of technologies would be extremely difficult to design
and implement. Chester (1995) discussed some of the problems with today's R&D measurement
tools and the reasons they are mistrusted by many CEOs and business unit leaders. In our
application we had limited resources and historical information, which constrained the situation
even further. In addition, it became clear that detailed assessments, along with the resulting
forecasts and roadmaps, often become instruments of management control that stifle creativity in
research efforts. As a result, some research organizations explicitly avoid detailed assessments so
as not to create a bureaucracy that stifles the creativity so critical to R&D success
(Perry 1995a and 1995b).
Consequently, assessment methods were sought that could be used as a diagnostic tool to
assess the center's strengths and weaknesses. The focus shifted from the specific technologies
being utilized to the conditions available to empower the research efforts. Burgelman and
Maidique (1988) presented the Innovative Capabilities Audit Framework, which included a
business-level audit that addressed these needs. They defined innovative capabilities as "the
comprehensive set of characteristics of an organization that facilitate and support its innovation
strategies" (p. 36). The five categories of variables that influence the innovation strategies of a
business unit were identified as:
1. Resources available for innovative activities.
2. Capacity to understand competitor innovative strategies.
3. Capacity to understand technological developments.
4. Structural and cultural context of the organization affecting entrepreneurial behavior.
5. Management capacity to deal with entrepreneurial activities.
Audit Instrument
From this framework, the innovative capability audit instrument was developed to assess
the center's specific innovative capabilities; the audit form is shown in Figure 1. These five
categories of variables were assessed through the following nine audit criteria, which address the
center's resources, strategy formulation and implementation:
Resources
1. Equipment and labs: the physical resources available to perform each of the phases of
innovation.
2. Personnel: access to sufficient personnel with adequate knowledge and experience to
perform the necessary tasks.
3. Access to information: capability to find needed information.
Strategy Formulation
4. Internal strengths: capability to formulate our own technology strategy based on the
knowledge and experience of our own team members. This is a measure of the Center's
intellectual capital.
5. Awareness of events: access to external information that might have implications for the
formulation of Center technology strategy.
6. Recognize importance: capability to recognize the impact external information should
have on our technology strategy, assuming awareness of and access to that information.
Implementation
7. Organization: is the organizational structure set up to help or hinder the performance of
the tasks in each phase? Organizational structures facilitate the interactions that control
communications, decision making, and goal setting within an organization.
8. Culture: the set of predominant attitudes within an organization that strongly influence
the behavior and performance of its members. Are the attitudes and behavioral
traditions of the team well suited to each phase of innovation?
9. Communication: are the formal and informal communication methods adequate to share
important information in a timely manner?
These assessment criteria were applied to five phases of the innovation process. Within the
spectrum of activity from basic research to applied research and development, CIES focuses
heavily on applied research and development. Therefore, the phases of innovation were
customized for this specific center so that their descriptions would cover most of the center's
activities.
These phases were:
• Needs Analysis: the activities associated with the acquisition of data, information and
references that help determine the need for specific research projects.
• Discovery: the activities associated with the creation of new ideas, concepts or methods that
would be useful to the overall project.
• Process Development: the activities related to taking existing ideas, concepts and methods
and turning them into a process that can be applied commercially.
• Process Validation: the testing, recording and analysis functions that validate the process
being developed and characterize its expected performance.
• Technology Transfer: the communication activities to 1) enable the industrial partner to use
the process developed; 2) create effective specifications and standards for those processes;
and 3) influence industrial standards-setting committees to accept them.
The audit was designed for administration as a survey in order to standardize the responses
and facilitate the analysis. All the center members, including the center director, faculty,
students and staff, were asked to participate anonymously. However, the responses were segregated
by respondent type to help interpret the results. They were given instructions that described all the
terms, concepts and processes, and individual assistance was offered to any respondent who
desired it. The respondents were asked to evaluate the center in the five phases of innovation using
the nine audit criteria, with the following instruction: "Compared to a realistic and ideal condition,
how would you assess the comparative condition of the Center, using the following scale:"
1. Excellent, or "the best",
2. Good, or "a leader",
3. Adequate, or "average",
4. Lacking, or "a follower", or
5. Needs improvement, or "just starting".
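To make this tabulation concrete, the sketch below shows one way such responses could be recorded and averaged by criterion and by phase. It is a minimal illustration, not the Center's actual instrument or data: the data structures, the helper function and the two sample responses are assumptions for the example.

```python
# Minimal sketch of tabulating audit responses; the phase and criterion
# names come from the audit design above, but the sample responses and
# their ratings are invented for illustration.

PHASES = ["Needs Analysis", "Discovery", "Process Development",
          "Process Validation", "Technology Transfer"]
CRITERIA = ["Equipment & labs", "Personnel", "Access to information",
            "Internal strengths", "Aware of events", "Recognize importance",
            "Organization", "Culture", "Communication"]

# Each response carries the respondent type and a 1-5 rating for every
# (criterion, phase) cell of the audit grid. Two hypothetical responses:
responses = [
    {"role": "student", "ratings": {(c, p): 2 for c in CRITERIA for p in PHASES}},
    {"role": "faculty", "ratings": {(c, p): 3 for c in CRITERIA for p in PHASES}},
]

def mean_score(cells):
    """Average the ratings over the given (criterion, phase) cells."""
    values = [r["ratings"][cell] for r in responses for cell in cells]
    return sum(values) / len(values)

# One row of the results grid: a criterion averaged across all phases.
print(mean_score([("Communication", p) for p in PHASES]))   # 2.5
# One column: a phase averaged across all criteria.
print(mean_score([(c, "Discovery") for c in CRITERIA]))     # 2.5
```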
The audit also sought to determine the emphasis given to each phase of innovation, to
assess the balance of the center's efforts among the five phases. Each respondent was asked to
allocate 100% of the center's effort across the five phases. In addition, the participants were asked
to provide open responses to the following questions:
• Who was your benchmark for comparison?
• What is our greatest strength?
• What is our greatest barrier to success?
• Which are our key technologies now?
• Which will be our key technologies in 5 years?
• Comment on any ratings that indicate very good or very poor results in your assessment.
• Any other comments?
Some of the center's projects were funded by industrial partners. We contacted these
partners via e-mail and asked them to document what they would like the center's emphasis to be
among the five innovation phases, allocating 100% of the center's effort across the phases as they
would like to see it. The objective was to identify any significant mismatch in effort across the
phases.
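The intended comparison is simple arithmetic: subtract the partners' preferred allocation from the center's own. A minimal sketch follows, with placeholder percentages; the actual survey figures appear in TABLE 2 below.

```python
# Hypothetical emphasis allocations; each must sum to 100%. These numbers
# are placeholders for illustration, not the survey results.
PHASES = ["Needs Analysis", "Discovery", "Process Development",
          "Process Validation", "Technology Transfer"]
center = {"Needs Analysis": 25, "Discovery": 15, "Process Development": 20,
          "Process Validation": 25, "Technology Transfer": 15}
partners = {"Needs Analysis": 15, "Discovery": 20, "Process Development": 15,
            "Process Validation": 25, "Technology Transfer": 25}

assert sum(center.values()) == sum(partners.values()) == 100

# A positive gap means the center puts more effort into a phase than its
# industrial partners would like; a negative gap means less.
for phase in PHASES:
    print(f"{phase:20s} {center[phase] - partners[phase]:+d}%")
```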
FIGURE 1
INNOVATIVE CAPABILITY AUDIT (CENTER)

[Blank audit form, not reproducible in this text version. Header fields record the date and the
respondent's participation (NSF RB2C, UTC) and role (center director, faculty, student, staff,
other). Instructions: "To evaluate the phases of innovation using the criteria below (except for
EMPHASIS, which is in percentage), apply the following scale. Compared to a realistic and ideal
condition, how would you assess the comparative condition of the Center: 1 = excellent (the best),
2 = good (leader), 3 = adequate (average), 4 = lacking (follower), 5 = needs improvement (just
starting)." The grid has one column per phase of the innovation process (Needs Analysis,
Discovery, Process Development, Process Validation, Technology Transfer) and rows for Phase
Emphasis (%), Resources (equipment & labs; personnel; access to information), Strategy
Formulation (internal strengths/intellectual capital; external forces: aware of events, recognize
importance), and Implementation (organization; culture; communication). General questions at the
bottom ask for the respondent's benchmark, the Center's greatest strength, its greatest barrier to
success, and its key technologies now and in 5 years.]
Results
During the spring of 1999, the audit was conducted with the center participants. Twenty-one
responses were received from 12 students, 5 faculty and 4 staff, including the center director
and assistant director. This accounted for the vast majority of the individuals heavily
involved in center activities at the time.
The objective of this exercise was to identify the innovative strengths and weaknesses of
the Center, and the survey results are presented in Figure 2. They are provided in graphical form
to give a quick view of the Center's condition: white boxes represent the top scores, while dark
boxes represent the lowest scores. The figure identifies the Center's major strength as strategy
formulation, particularly in the "internal strengths" category. This measurement of intellectual
capital reflects the Center's strong capability to formulate effective technology strategies based
primarily on the knowledge and experience of the team's members. This is shown in the figure by
the number of white boxes in that row and by the overall score of 2.08, which is by far the lowest
score, reflecting the greatest strength. The mean score across all the assessment criteria was 2.36,
with a standard deviation of 0.19. In this study a score of 1 represented excellent processes
compared to the best, while a 5 represented significant room for improvement. The participants
were asked to comment on the Center's greatest strengths; the most common comments focused on
teamwork, technical knowledge, enthusiasm, excellence in ideas and rapid response.
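As a quick check, these summary statistics can be reproduced from the overall criterion scores shown in Figure 2. The sketch below does so, assuming the paper used the population (rather than sample) standard deviation.

```python
from statistics import mean, pstdev

# Overall scores for the nine audit criteria, taken from Figure 2.
criterion_scores = [2.66, 2.49, 2.30,   # resources
                    2.08, 2.19, 2.22,   # strategy formulation
                    2.44, 2.25, 2.64]   # implementation

print(round(mean(criterion_scores), 2))    # 2.36, the reported mean
print(round(pstdev(criterion_scores), 2))  # 0.19, the reported deviation
```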
The survey also identified laboratory resources and communication as the greatest barriers
to effective action. This reflected dissatisfaction with the physical resources and laboratory
services available to the teams. In addition, the formal and informal communication did not
adequately provide important information to the team members in a timely manner. The comments
on the Center’s weaknesses supported these assessments. The most common comments discussed
lack of communications and insufficient students and faculty researchers.
The survey asked the participants to identify the strengths and weaknesses in relation to
the various phases of the innovation process. The phase exhibiting the greatest strength was
process validation, and the greatest weakness was process development. However, the results
did not vary significantly among the phases. As shown in Figure 2, the overall scores for the
phases ranged from 2.31 to 2.45, with a standard deviation of 0.05. The responses were also
segregated by respondent type (faculty, students and staff), and the results are shown in
TABLE 1. (The staff includes the secretaries, the center director and the assistant director.)
The students gave the most positive results, with an overall score of 2.24 and scores of 1.84 for
internal strengths and 2.07 for awareness of technologically important events. The students saw
the major barriers as a lack of resources: equipment and laboratories, with a score of 2.57, and
personnel, with 2.47. The staff were the most critical, with an overall score of 2.77. Their most
critical areas were communication, with a score of 3.70, and equipment and laboratories, with
3.15. Their strongest areas, both with scores of 2.35, were the ability to recognize the importance
of external technical news and the implementation culture.
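One plausible way to produce the per-group scores in TABLE 1 below is to collect each respondent group's ratings and average them separately, as sketched here. The scores in the example are invented; only the grouping logic is the point.

```python
from collections import defaultdict
from statistics import mean

# Invented per-respondent overall scores, one (role, score) pair each.
responses = [("student", 2.1), ("student", 2.4), ("faculty", 2.5),
             ("faculty", 2.3), ("staff", 2.8), ("staff", 2.7)]

by_role = defaultdict(list)
for role, score in responses:
    by_role[role].append(score)

# Mean overall score per respondent type, as tabulated in TABLE 1.
for role, scores in sorted(by_role.items()):
    print(f"{role:8s} {mean(scores):.2f}")
```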
FIGURE 2
INNOVATIVE CAPABILITY AUDIT (CENTER)
SURVEY RESULTS, May 4, 1999

Scale: 1 = excellent, 2 = good, 3 = adequate, 4 = lacking, 5 = needs improvement. Participants: 21.
[In the original figure, each criterion-by-phase cell is shaded by relative score group, from white
(top scores) to dark (lowest scores). The shading and the individual cell values are not
reproducible here; the recoverable summary scores follow.]

Overall scores by criterion (across all phases):
RESOURCES: Equipment & labs 2.66; Personnel 2.49; Access to information 2.30; mean 2.48.
STRATEGY FORMULATION: Internal strengths (intellectual capital) 2.08; Aware of events 2.19;
Recognize importance 2.22.
IMPLEMENTATION: Organization 2.44; Culture 2.25; Communication 2.64; mean 2.44.

Overall scores by phase: Needs Analysis 2.32; Discovery 2.39; Process Development 2.45;
Process Validation 2.31; Technology Transfer 2.41. Overall mean: 2.36.

GENERAL
Who was your benchmark? UCSD, ISC, ISIS, VPI.
What is our greatest strength? Teamwork, technical knowledge, enthusiasm, excellent ideas, rapid response.
What is our greatest weakness? Internal communications; lack of students and faculty; lack of faculty participation except for the Center Director.
Which are our key technologies now? FRP sheets, products & applications; advanced materials for concrete structures; non-destructive testing; smart sensing; rapid load testing; composite materials.
Which will be our key technologies in 5 years? FRP strengthening applications; new products & systems with high material properties; load testing; manufacturing and sensing; energy dissipation; seismic.
TABLE 1
Survey Results by Respondent Type

                               Faculty   Students   Staff   Total Center
RESOURCES
  Equipment & labs               2.60      2.57      3.15       2.66
  Personnel                      2.57      2.47      2.75       2.49
  Access to information          2.43      2.20      2.60       2.30
STRATEGY FORMULATION
  Internal strengths
    (intellectual capital)       2.23      1.84      2.75       2.08
  Aware of events                2.23      2.07      2.60       2.20
  Recognize importance           2.23      2.11      2.35       2.22
IMPLEMENTATION
  Organization                   2.43      2.37      2.65       2.44
  Culture                        2.43      2.21      2.35       2.25
  Communication                  2.60      2.32      3.70       2.64
OVERALL                          2.42      2.24      2.77       2.36
Faculty responses fell between those of the other two groups, with an overall score of 2.42. Their
strengths were in all the areas of strategy formulation, each with a score of 2.23. The major
barriers were the lack of equipment and laboratories and communications, both with a score of 2.60.
In addition to the survey completed by the Center participants, 19 customers were asked via
e-mail to estimate the ideal emphasis for CIES within the five phases of the innovation cycle. The
objective was to determine whether the Center's emphasis was aligned with its customers'
expectations. Six of these customers responded; the results are shown in TABLE 2. The Center
participants emphasized process validation and needs analysis, while the customers expected an
emphasis on technology transfer and process validation. The largest discrepancy was in the
emphasis on technology transfer, which suggests it would be valuable for each project team to
communicate directly with its customers and explicitly discuss the type and level of technology
transfer to be used, to avoid unmet expectations. However, some discrepancy in this area is to be
expected. Most of the projects are in the early stages of innovation and have not yet put much
effort into technology transfer, yielding a relatively low emphasis on it. The customers, on the
other hand, aim to gain the benefit of the technology transfer and therefore rank it as most
important. The emphasis on technology transfer thus does not necessarily reflect a major
misalignment of effort, but it is a potential problem area if left unaddressed.
TABLE 2
Center Emphasis Results

              Needs      Discovery   Process       Process      Technology
              Analysis   Phase       Development   Validation   Transfer
Center          24%        13%         19%           27%          17%
Customers       17%        18%         16%           23%          26%
Comparison       7%        -5%          3%            4%          -9%
Application of the Audit
The main utilization of the audit results was to provide information for a planned Center
self-assessment session (Nystrom & Attassery 1999). The session provided a forum for the
participants to review the results of the survey, generate effective dialogue and reflect on the
implications of the Center’s strengths, and barriers to effective performance. Then the focus
shifted to the future, brainstorming ideas and building consensus on the importance of various
improvement opportunities. The audit was extremely useful in this session. It focused the attention
of the team members on those areas that they themselves had identified as lacking. The graphical
presentation, in color, quickly showed the condition of the center and the results segregated by the
type of respondents clarified the different perspectives of the participants. This information
provided the session participants with issues to discuss in a non-threatening or confrontational
way. These discussions led to many useful recommendations and a very valuable interaction as
measured by another survey given at the end of the session.
Conclusion
The objective of this paper was to present a first step towards effective technology
management, and the method presented proved very useful in the case study. It is worth reflecting
on the reasons it was effective, so that other assessment efforts might benefit. It provided useful
information that could be used in the self-assessment session to generate clear direction for
improvement. It provided a learning and team-building opportunity for the Center, in which the
members were prompted to reflect on what worked well or poorly for the Center; they could then
compare their own views with those of the other members and discuss the differences during the
session. It did not require external subject-matter experts to provide the assessment, so it can be
performed with limited resources. The results were quickly available and easily explained, since
the process was relatively simple and did not rely on complex analytical methods. It allowed for
customization to the specific circumstances of a center. It gave the center participants and leaders
an opportunity to reflect on the performance of the center and on the activities that could make it
more effective, and it built team commitment to those activities. Finally, it can be repeated in the
future to assess ongoing trends in these areas.
References
• Blau, J. (1995). Daimler-Benz Research Centralizes for Value. Research Technology
Management, 38:1, 6-7.
• Blau, J. (1996). 3M Extending Audit Program to Its Technical Service. Research
Technology Management, 39:3, 3-4.
• Burgelman, R. A. and M. A. Maidique (1988). Strategic Management of Technology and
Innovation. Irwin, Homewood, Illinois.
• Chester, A. N. (1995). Measurements and Incentives for Central Research. Research
Technology Management, 38:4, 14-22.
• Forbairt (1998). National Technology Audit Programme. www.netc.ie.ntap.html.
• Forthright Innovation (1998). Technology Audits. www.stir.ac.uk/innovation_park/fi.html.
• Foster, T. M. (1996). Making R&D More Effective at Westinghouse. Research
Technology Management, 39:1, 31-37.
• FRD (1998). National Research and Technology Audit. www.frd.ac.za/frdnews/may96/audit.
• Jaskolski, S. V. (1996). New Role for R&D: The Challenge of Growth. Research
Technology Management, 39:6, 13-18.
• Keiser, B. A. and N. R. Blake (1996). How Nalco Revitalized Its Quality Process for
R&D. Research Technology Management, 39:3, 23-29.
• Klein, D. A. (1995). The Strategic Management of Intellectual Capital. Butterworth-Heinemann, Boston.
• Nystrom, H. E. and N. Attassery (1999). Technology Management at University Research
Centers. Proceedings of the American Society for Engineering Management 1999
National Conference, 334-337.
• Oxford Innovation (1998). Technology Audit and Assessment Services.
www.oxtrust.org.uk/oi/audit.htm.
• Perry, T. S. (1995a). Designing a Culture of Creativity. Research Technology
Management, 38:2, 14-17.
• Perry, T. S. (1995b). Managed Chaos Allows More Creativity. Research Technology
Management, 38:5, 14-17.
• Roberts, B. (1998). The New R&D. Electronic Business, November, 69-74.
• Technology Transfer Group (1998). Technology Audit. www.wda.co.uk/business/techtran/audit.
• Tipping, J. W., E. Zeffren and A. R. Fusfeld (1995). Assessing the Value of Your
Technology. Research Technology Management, 38:5, 22-39.
Halvard E. Nystrom
Engineering Management
University of Missouri-Rolla
Rolla, MO 65409-0370, USA
Dr. Nystrom is an Assistant Professor of Engineering Management at the University of
Missouri-Rolla. He obtained his B.S. in Mechanical Engineering at the University of Illinois,
Urbana, his MBA from Stanford, and his Ph.D. in Industrial Engineering with an emphasis in MOT
from Arizona State University. His research interests are in technology management, finance,
distance education and marketing. He has extensive industrial experience with Digital Equipment
Corp., Castle and Cooke Inc. in Ecuador, and the Westinghouse R&D Center. He is the NSF
evaluator for the Industry/University Cooperative Research Center for the Repair of Buildings and
Bridges with Composites.