Project Name
Document Version 1.0
Prepared by Jane Doe, ITS
Last Edited July 23, 2016
Project lifecycle: PLAN > REQUIREMENTS > SOLUTION ANALYSIS > DESIGN > BUILD > TEST > TRAIN/DEPLOY > MAINTENANCE
TO THE DOCUMENT OWNER: This template is provided as a guideline and resource. The structure and instructions
give detail about what might go into a completed document. Only you and your team, however, know what will
best fit the needs of your specific effort, and you are encouraged to adapt the template as appropriate to meet
those needs.
Research Summary
The Research Summary provides an overview of the research and data gathered for the project, as well
as the recommendations that should be used to guide the project. The Research Summary will be used
to present the research methodology and results to project stakeholders for approval of recommended
action items. The goal of the Research Summary is to narrow the field of solution options through high-level research and recommend a small set of options for more detailed evaluation; see the Product
Selection Methodology at http://www.utexas.edu/its/projects/selection.php, and consult management
for additional expectations.
Executive Summary
Provide the top 3-5 take-away points, results, or recommendations across all the research performed.
Include the research methods that led to each recommendation.
Research Summary and Results
Include a description of the goals of the research and the reasons behind the methods that were chosen.
For each research method used, document the most important take-away points or results.
Surveys
Surveys can be used to gather information from existing and potential customers, users of a product or
technology at other universities, etc.
Focus Groups
Focus groups may be used to elicit feedback, emerging needs, and other information from stakeholders,
system users, and support personnel.
Interviews
Individual or small group interviews may be particularly effective with key users and stakeholders or with
experts in the area under review.
Internal Benchmarking
Are there other groups at the university that have created a solution for this problem? What can be
learned from these efforts? Initial queries may be directed to the it-talk and txedge mailing lists, as well
as to targeted departments and user groups.
Peer Institution Benchmarking
What do peer institutions have to say about the project? Do they have experience with similar efforts,
products, or technologies? Peer institution benchmarking, including responses from at least 3-4
universities, is expected for solution assessment. The list of the university's peer institutions is available
at http://www.utexas.edu/reporting/publications#ut-comparison-group.
Historical Data
Has this been tried before, at UT Austin or elsewhere? If so, what lessons have been learned from those
efforts?
Info-Tech Review
What does industry research say about the challenge? This type of research can be used to identify
potential products, technology trends, and alternative approaches to the project.
Peer Journal Review
Has the research subject been discussed in a journal such as Educause Quarterly? Journal publications
can provide, for example, details on attempts to address similar problems.
System Performance Data
What support metrics are available? Can information from a system currently in use provide insight into
changing usage patterns, bottlenecks, peaks, etc.?
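As a minimal sketch, the Python fragment below shows one way to summarize an existing system's request log by hour so that peaks and shifting usage patterns become visible. The log file name and the "timestamp" column are hypothetical placeholders, not part of this template; adapt them to whatever metrics source the system under review actually provides.

# Minimal sketch: count requests per hour from a hypothetical CSV access log
# with an ISO-8601 "timestamp" column; file and column names are placeholders.
import csv
from collections import Counter
from datetime import datetime

def hourly_request_counts(path="access_log.csv"):
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            # Bucket each request into the hour in which it occurred.
            counts[ts.replace(minute=0, second=0, microsecond=0)] += 1
    return counts

if __name__ == "__main__":
    # Report the five busiest hours as candidate peak or bottleneck periods.
    for hour, n in hourly_request_counts().most_common(5):
        print(f"{hour:%Y-%m-%d %H}:00  {n} requests")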
User Interface Testing
Methods such as card sorts, wireframe testing, etc. may be used to elicit feedback on design, business
processes, and workflow.
Comparative Analysis
A comparative analysis of available products and/or services is required. Details such as technical
compatibility, implementation overhead, licensing and support options, implementation timeframe,
maintenance costs, and distinguishing factors should be included. Additionally, the comparative analysis
should include assessment requirements, a sub-set of the project requirements consisting of "show
stoppers" without which a given solution would not be considered viable -- for example, technological
constraints or core functionality.
Some general criteria for comparative review include:
• Will the system be hosted on campus or externally? If the system is hosted externally, which on-campus systems will it need to integrate with (e.g., identity management, authentication, monitoring) and how will it do so?
• Will any changes or upgrades be required to on-campus systems to integrate with the new system?
• Will the system store sensitive data, including logs? If so, are the appropriate security measures in place?
• Can the system's data be exported, for example, for sharing with another system?
• Will new skills be required by university personnel to use or manage the system? How will these skills be acquired?
For an example solution assessment matrix, see the Appendix of the Centralized Authentication System Assessment (CASA) Research Summary: https://www.utexas.edu/its/casa/governance/protected/CASA%20Research%20Summary%20and%20Recommendations_v1_1.pdf
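As a rough illustration only (not a reproduction of the CASA matrix), the Python sketch below shows how such an assessment matrix might be scored: candidates that fail any "show stopper" assessment requirement are screened out, and the remaining options are ranked by a weighted sum of the comparison criteria. All criterion names, weights, and scores are hypothetical placeholders.

# Sketch of a weighted assessment matrix with "show stopper" screening.
# Criteria, weights, and candidate scores are hypothetical placeholders;
# replace them with the project's actual assessment requirements and data.

SHOW_STOPPERS = ["supports_campus_authentication", "exports_data"]

CRITERIA_WEIGHTS = {  # relative importance of the remaining criteria
    "implementation_effort": 3,
    "maintenance_cost": 2,
    "vendor_support": 2,
    "user_experience": 3,
}

candidates = {
    "Solution A": {"supports_campus_authentication": True, "exports_data": True,
                   "implementation_effort": 4, "maintenance_cost": 3,
                   "vendor_support": 5, "user_experience": 4},
    "Solution B": {"supports_campus_authentication": False, "exports_data": True,
                   "implementation_effort": 5, "maintenance_cost": 4,
                   "vendor_support": 3, "user_experience": 5},
}

def score(profile):
    # Screen out any candidate missing a show-stopper requirement.
    if not all(profile.get(req, False) for req in SHOW_STOPPERS):
        return None
    # Rank surviving candidates by a weighted sum of criteria scores (1-5 scale).
    return sum(weight * profile.get(name, 0)
               for name, weight in CRITERIA_WEIGHTS.items())

for name, profile in candidates.items():
    result = score(profile)
    print(name, "- eliminated (missing show stopper)" if result is None
          else f"- weighted score {result}")

In practice, the criteria, weights, and rating scale would be drawn from the assessment requirements agreed on with project stakeholders rather than the placeholder values shown here.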
Recommendations and Next Steps
Describe the overall recommendations that emerged from the research and outline next
steps. For solution evaluations, the recommendations should include narrowing the options to a small
subset, such as 2-4 solutions, that can be reviewed in more detail. If vendor products are part of the
solution, you must contact the Purchasing department to assist in determining next steps.
Appendix
Include detailed research results as appropriate.
Revision History

Version | Date | Updater Name | Description
V1      |      |              | Initial draft completed.