Presentation at the Amsterdam Sakai conference on the process followed at CSU to select Sakai

A Suggested Methodological
Framework for Evaluating and
Selecting an Open Source LMS
Dr Philip Uys <puys@csu.edu.au>
Manager, Educational Design and Educational Technology,
Centre for Enhancing Learning and Teaching
Matt Morton-Allen <mmorton-allen@csu.edu.au>
Teaching, Learning and Community Source Liaison Officer
Division of Information Technology
Introduction
• Set out in Feb 2006 to enhance the virtual
learning environment
• Became the Online Learning Environment (OLE)
Programme
• Originally focused on individual tools but morphed to
framework emphasis
• Started with 12 possible solutions
• Mix of open source, commercial and in-house options
• Ended with two open source
• Selected Sakai
Fast Track Approach
• Initially used “fast track” approach
• Attempted to avoid lengthy investigation of
requirements
• Focus on reusing previously supplied high
level business requirements
• Success hinged on the ability to easily identify a low-risk solution
Fast Track Approach (cont.)
• Reality was that too little information meant
too many options
• Too many options meant too high a risk
• High risk combined badly with lack of process
transparency
• Also did little to bring together cross-silo
issues between requirements
A Different Approach
• Once the “fast track” approach was abandoned an alternative was needed
• Extensive experience in the group not
sufficient to address OSS complexities
• Short environment scan showed two possible
frameworks:
• Business Readiness Rating
• Open Source Maturity Model
Business Readiness Rating
http://www.openbrr.org/
• Geared towards helping evaluate OSS
• Identifies 12 criteria, each with its own tests
• Suggests only a portion of these be applied
• Assigns a weight to each test within a criterion, giving an end score (see the sketch below)
• Has online records of submissions made by
others
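As an illustration of the BRR mechanics, here is a minimal Python sketch of the weighted scoring. The criteria names, weights and test scores are invented for illustration; the real BRR defines twelve criteria with their own tests and weights at openbrr.org.

```python
# Minimal sketch of BRR-style weighted scoring. The criteria, weights
# and test scores below are illustrative only; the real BRR defines
# twelve criteria, each with its own tests and weights.

# Each criterion: a weight in the final rating, plus (test weight,
# test score) pairs on the 1-5 BRR scale. Weights each sum to 1.0.
criteria = {
    "Functionality": {"weight": 0.25, "tests": [(0.6, 4), (0.4, 3)]},
    "Community":     {"weight": 0.25, "tests": [(1.0, 5)]},
    "Support":       {"weight": 0.50, "tests": [(0.5, 2), (0.5, 4)]},
}

def brr_score(criteria):
    """Weighted sum of per-criterion scores, each a weighted sum of tests."""
    return sum(
        c["weight"] * sum(w * s for w, s in c["tests"])
        for c in criteria.values()
    )

print(f"Overall rating: {brr_score(criteria):.2f}")  # 3.65 on the 1-5 scale
```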
Business Readiness Rating
• When looked at closely, BRR has some flaws
• The measures of the tests within criteria are high level and generic
• Leads to a need to revise them for local use
• Or be faced with the possibility of an undiminished pool of options
• Either means you need to move beyond BRR for a final
decision
• In retrospect could have been useful early on as a
filter
Open Source Maturity Model
http://www.navicasoft.com/pages/osmm.htm
• Another model that could be considered – but limited
• The OSMM assesses the maturity level of all key product
elements:
• Software
• Support
• Documentation
• Training
• Product integration
• Professional Services
A Different Approach
• When neither BRR nor OSMM seemed to fit, we began to consider afresh
• Agreed on the need for a framework that would be:
• Flexible – willingness to adapt throughout
• Aligned – consistent with strategy
• Comprehensive – extensive and in-depth investigation
• Transparent – rigorous debate
• Devised the FACT framework for our own needs
The FACT Framework
1. Identify requirements
2. Weigh the requirements
3. Identify possible solutions
4. Identify “killer” requirements
5. Apply “killer” requirements
6. Determine short list
7. Identify overarching concerns
8. Apply overarching concerns
1. Identify Requirements
• Utilised collaborative process to create
extensive (> 40) requirements list
• Sources included strategy documents, feature lists
from commercial and OSS products, team
member experience
• Split into high, medium and low priority
• Identified levels of compliance with each requirement or “criterion”
2. Weigh the Requirements
• Next we gave a weighting to each requirement
• Again followed a highly collaborative process
• Required several iterations to get consensus
• Split 1000 points over 40 requirements (see the sketch below)
• Revised several weightings when unable to differentiate possible solutions
• Always done in collaborative and transparent way
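To make steps 1–2 concrete, here is a minimal Python sketch of the weighted comparison, assuming invented requirement names, weights and compliance levels rather than the actual CSU list.

```python
# Sketch of the weighted comparison from steps 1-2. Requirement names,
# weights and compliance levels are placeholders; in the real exercise
# 1000 points were split across roughly 40 requirements.

weights = {                     # points per requirement (sum to 1000)
    "open standards support": 400,
    "assessment tools": 350,
    "scalability": 250,
}

# Compliance level of each solution with each requirement,
# e.g. 0 = none, 1 = partial, 2 = full, 3 = exceeds.
compliance = {
    "Solution A": {"open standards support": 2,
                   "assessment tools": 3,
                   "scalability": 1},
    "Solution B": {"open standards support": 3,
                   "assessment tools": 1,
                   "scalability": 2},
}

def weighted_score(solution):
    """Sum of weight x compliance level over all requirements."""
    return sum(weights[req] * level
               for req, level in compliance[solution].items())

for name in sorted(compliance):
    print(name, weighted_score(name))   # A: 2100, B: 2050
```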
3. Identify Possible Solutions
• Compiled list of possible solutions
• Derived from a number of sources including
team expertise, industry reports, peer
institutions etc
• Resulted in list of 12 options
• It was hoped this might be trimmed
4. Identify Killer Criteria
• Realised evaluating 12 products against over 40 requirements would take a long time
• Decided some requirements were “show
stoppers” and thus “killed” the option
• Collaboratively decided which of the requirements had a compliance level that was unacceptable
5. Apply “Killer” Criteria
• Applied killer criteria to each of possible
solutions
• Once a killer had been reached further
analysis was stopped
• Not all “killers” were considered equal – some options needed more than one to be removed
• Reduced the list of 12 options down to 5:
• Blackboard, Angel, In-house, Moodle & Sakai
5. Apply “Killer” Criteria (cont.)
• Removed options for a number of reasons:
• Mergers
• Insufficient local support
• Small user base
• Interestingly, cost did not rule out any options at this point (see the filtering sketch below)
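A minimal Python sketch of this filtering step follows. The killer predicates echo the removal reasons above, but the thresholds and product data are invented.

```python
# Sketch of step 5: options are checked against "killer" criteria in
# turn, and analysis of an option stops at the first killer it hits.
# Predicates echo the removal reasons above; thresholds and data are
# invented for illustration.

killers = [
    ("recent or pending merger",   lambda o: o["merger"]),
    ("insufficient local support", lambda o: not o["local_support"]),
    ("small user base",            lambda o: o["user_base"] < 50),
]

def first_killer(option):
    """Return the first killer criterion the option fails, else None."""
    for name, is_fatal in killers:
        if is_fatal(option):
            return name          # no further analysis of this option
    return None

options = {
    "Product X": {"merger": True,  "local_support": True, "user_base": 200},
    "Product Y": {"merger": False, "local_support": True, "user_base": 20},
    "Product Z": {"merger": False, "local_support": True, "user_base": 500},
}

shortlist = [n for n, o in options.items() if first_killer(o) is None]
print(shortlist)   # ['Product Z'] - X and Y hit a killer
```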
6. Determine Short List
• From the list of 5 we then removed:
• Blackboard – concerns over lack of competition,
little leverage to control costs
• In-house – advantage of reinventing the wheel questionable, OSS seemed to offer the same benefits without starting from scratch, lack of agility
• Angel – user base too small, too high risk,
detriments of commercial without benefits of size
• Leaving us with Moodle and Sakai
7. Develop OACs
• After many months of effort the quantitative analysis gave near-identical scores:
• Sakai – 2428, Moodle – 2402 (a difference of about 1%)
• If quantitative comparisons had come up empty, what about qualitative ones?
• Developed “overarching concerns” or OACs:
• Completely qualitative
• Focus on general ideology not current features
• Designed to ensure alignment between culture of
solution and the University
7. Develop OACs (cont.)
• Ended up with 10 OACs covering a range of issues:
• Was the community decision making centralised
or decentralised?
• Was the product enterprise oriented?
• Was the product stronger in secondary or tertiary
sectors?
• Was the community more technically or more
pedagogically focused?
8. Apply the OACs
• Once compiled, the OACs were applied to the short list of Moodle and Sakai
• Four major stakeholder groups were asked to decide on Sakai or Moodle for each OAC (see the tally sketch below)
• End result looked like …
8. Apply the OACs (cont.)
[Table of OAC results by stakeholder group – image not reproduced]
8. Apply the OACs (cont.)
• 3 of the 4 team members agreed, but consensus could not be reached even after intense debate
• A final Steering Committee vote selected
Sakai unanimously
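To illustrate the tallying in step 8, here is a minimal Python sketch. The OAC names echo the examples above, and the votes are invented, since the actual result table is not reproduced here.

```python
# Sketch of the step 8 tally: four stakeholder groups each pick Sakai
# or Moodle for every OAC. OAC names echo the examples above; the
# votes are invented placeholders.

from collections import Counter

votes = {   # OAC -> one pick per stakeholder group
    "decentralised decision making": ["Sakai", "Sakai", "Moodle", "Sakai"],
    "enterprise orientation":        ["Sakai", "Sakai", "Sakai", "Moodle"],
    "tertiary sector strength":      ["Sakai", "Moodle", "Sakai", "Sakai"],
    "pedagogical focus":             ["Moodle", "Moodle", "Sakai", "Sakai"],
}

overall = Counter()
for oac, picks in votes.items():
    tally = Counter(picks)
    overall += tally
    print(f"{oac}: {dict(tally)}")

print("overall:", dict(overall))   # here {'Sakai': 11, 'Moodle': 5}
```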
…
Observations
• The deeper the analysis, the more possible solutions you can remove
• Shallow analysis using models such as BRR can be useful in early stages
• Quantitative comparisons are less meaningful when you can change any aspect of the software – there are lots of grey areas
Observations (cont.)
• The introduction of qualitative measures is
unavoidable and should be accepted
throughout
• Qualitative comparison can only be accepted
in an environment of transparent rigour
• Qualitative measures can only follow quantitative comparison – they lack conviction in isolation
Observations (cont.)
• Removing cost from the equation helped
compare OSS and commercial
• Assuming cost is near equal over a period of time
removes bias and misconception (i.e. no “free
lunch”)
• Evaluations require consideration of local needs and politics – highly strategic decisions cannot be based on off-the-shelf comparisons
Observations (cont.)
• Requiring consensus was time consuming but
gave strength to the results:
• Forced rigorous debate
• Ensured transparency throughout
• You cannot rush decisions this large – taking
the time allowed a considered decision
Observations (cont.)
• The framework wouldn’t have worked outside
the context of the project management
methodology
• A framework needs to be contextualised
within the organisational culture and
strategies
A Different Approach
The FACT framework
• Flexible – willingness to adapt throughout
• Aligned – consistent with strategy
• Comprehensive – extensive and in-depth
investigation
• Transparent – rigorous debate
Thank You!
For more information
http://www.csu.edu.au/division/landt/interact/
Dr Philip Uys <puys@csu.edu.au>
Manager, Educational Design and Educational Technology,
Centre for Enhancing Learning and Teaching
http://www.csu.edu.au/division/celt/exec_staff/philip.uys
Matt Morton-Allen <mmorton-allen@csu.edu.au>
Teaching, Learning and Community Source Liaison Officer