The Role of Cisco VIRL in Network Training Environments
Bradley Mark Herbert
Supervisor: Dr Grant Wigley
University of South Australia
Introduction
This (minor) thesis investigates the role of a new software platform, Cisco Virtual Internet Routing Lab (VIRL), in an academic training environment for learning computer networking.
• Reasons for this research:
  • Existing tools only appeal to select learning styles [1] [2].
  • Existing tools lack some functionality of real-world Cisco equipment.
  • Real-world equipment is costly, inaccessible and high risk [3] [4].
  • Use of real-world equipment does not support external students.
Introduction (CONT)
• This Minor Thesis is not:
  • Research on the usability of Cisco VIRL
  • Research on integrating VIRL into a training environment
  • Research on a number of different simulators
• This Thesis explored and researched:
  • The technical features of Cisco VIRL, but only in respect of how they assist with learning
  • A comparison of VIRL with other tools to argue VIRL’s relevance
  • An exploration of learning and behavioural sciences
  • The importance of the practical approach
Research Question
• This Minor Thesis aimed to address the following research question:
“What role does Cisco Virtual Internet Routing Lab play in
network training environments to help students and
trainees understand advanced networking concepts?”
• Should Cisco VIRL be used at all in training environments?
• If so, when and how should the application be used?
Learning Approaches
• Theoretical Approach
• Lectures
• Exams
• Written Assignments
• Practical Approach
• Exercises / Assignments / Exams
• Packet Tracer
• NetLab NDG – Used at UniSA
• Real Equipment (without NetLab)
• Emulation, e.g. GNS3
• Simulation, e.g. CORE
• Others
• Demonstration
• Teacher uses a visualisation tool to demonstrate a concept
• Self-Learning
• Use of a tool at home to practice networking
Literature Findings
According to the literature, various challenges exist in teaching networking:
• Learning Challenges
• Existing tools only appeal to select learning styles [1] [2].
• Existing tools are used to measure learning outcomes, not increased skills or knowledge.
• Use of real-world equipment does not support external students.
• Impoverished international students may need additional learning support.
• Managerial Challenges
• Insufficient access to real equipment
• Students must work in large groups because of costs
• Equipment (often accidentally) broken or damaged.
• Limited access in third world countries
• Technical Challenges
• Existing tools lack the functionality of real-world Cisco equipment.
• Real equipment lacks learning features, e.g. exam grading, activity planning, visual abstraction, topology, etc.
Learning Methods
• If Person A completes a networking lab successfully
and quickly, have they really learned?
• According to ‘A Survey on the Challenges Faced by the Lecturers in Using Packet Tracer Simulation in Computer Networking Course’ (2014) [2], the answer is:
  • Not necessarily
Learning Methods (CONT)
• Lecturer quoted in the aforementioned paper [2]:
“Students are not able to answer the questions or execute any basic
configuration instructions learned at the initial stages of the course; it
seems as if the students have never learned them. It forces us to repeat
and repeat the process, this is time-consuming. Students only manage
to follow laboratory instructions, step-by-step but they fail to understand
the concept or theory used in the laboratory activity.
This problem is prominent for the topics like routing protocol, subnetting
and ACL. Students only memorize the instructions or the theories learned
without really knowing the actual time or circumstances that the
instructions or theories should be applied.”
SOURCE: ‘Survey on the Challenges Faced by the Lecturers in Using Packet Tracer Simulation in Computer Networking Course’ (2014, p. 13) [2]
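• (Aside) The subnetting concept mentioned in the quote can be illustrated with a minimal Python sketch using the standard ipaddress module – illustrative only, and not part of the study or its course material:

# Illustrative only: the concept behind subnetting, as opposed to memorised CLI steps.
import ipaddress

network = ipaddress.ip_network("192.168.1.0/24")   # a /24 block of 256 addresses
subnets = list(network.subnets(new_prefix=26))     # divide it into four /26 subnets

for subnet in subnets:
    # Each /26 holds 64 addresses; 62 are usable once the network and broadcast addresses are excluded.
    print(subnet, "usable hosts:", subnet.num_addresses - 2)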
Learning Methods (CONT)
• In the paper ‘Using a network simulation tool to engage students in active learning enhances their understanding of complex data communications concepts’ (2005) [5]:
• The technology itself does not enhance learning
• The use of simulation tools results in a perceived learning
benefit
• Contrary evidence shows that often students do not
have a sound understanding despite use of the tool.
• Practical activities followed step by step
• Step by step routine memorised without understanding [5]
Learning Methods (CONT)
• It is often assumed that a practical approach is superior to a theoretical approach when learning about computer networks.
  • This is a misconception.
  • According to the paper ‘Design Patterns for Learning and Assessment… (2009)’, learning computer networks requires more than a practical environment alone.
  • According to the paper “‘Dry laboratories’ in science education; computer-based practical work” (1998), practical work takes considerable effort to plan and makes abstract concepts difficult to demonstrate.
• Abstract Conceptualisation – the ability to explain a concept in your own words based on your own understanding
  • Requires a theoretical body of knowledge.
  • Therefore both a practical and a theoretical approach are beneficial.
Learning Methods (CONT)
• Too often:
  • A practical approach is not used
  • According to the paper ‘From Theory to Practice: Adapting the Engineering Approach’ (2012) [1], in countries like China students perceive professors as always right, which impacts learning.
  • In the aforementioned paper, impoverished countries cannot afford real equipment and so rely on theory only.
• Active Experimentation – the ability to apply what you know in a real-world environment
  • For networking, only a practical approach suits this style of learning
  • According to ‘From Theory to Practice: Adapting the Engineering Approach’ (2012) [1], those who had no practical experience had difficulty applying the concepts.
Learning Methods (CONT)
• Too often:
  • Learning is approached from a position of assumed knowledge
  • In Western cultures, students have typically already used computers for years
  • They more easily understand the concepts due to prior knowledge
  • Lack of exposure to technology while young can have a detrimental impact on learning technical subjects
  • The brain develops best while a person is young
  • This development is essential for later learning
  • Without a practical approach, students are unlikely to be exposed to technology
  • A practical approach is even more important for impoverished students due to their lack of exposure to technology
Learning Methods (CONT)
• Different learning styles:
  • Different learning styles may necessitate different tasks
  • Active Experimentation – allow students to freely design their own network topologies – not possible with real equipment!
  • Long distance – allow students to work from home or remote locations – not possible with real equipment
  • Allow students to troubleshoot networks using advanced features
    • Many debugging commands are missing from Packet Tracer
    • Spanning Tree is only simulated, with little realism
  • Students with disabilities may struggle to see the colours and animations in some tools such as Packet Tracer
  • Tailored labs – labs tailored to the student’s learning needs and desired style of learning – not possible with real equipment / NetLab
Evaluation of existing tools (CONT)
• Evaluation of existing tools and platforms:
  • The literature outlines issues with real equipment
  • Education institutions face challenges when using real equipment
    • High maintenance and cost – estimated at AU$50,000
    • Time wasted setting up the equipment
    • Often integrated into a so-called “Core” backbone
      • Therefore, the topology is not easily changeable
    • Wastage of computing resources
      • Lab equipment utilises less than 5% of its computing resources
      • Separation needed for stability and security
    • No abstraction
      • No visual topology
      • No visualisation of packet movement (cf. Packet Tracer)
Evaluation of existing tools (CONT)
• Evaluation of existing tools and platforms:
  • NetLab – used at the University of South Australia
    • Uses a web interface to interact with real equipment
    • Abstraction is introduced – students no longer “feel” the real equipment
    • Management issues in terms of cost and implementation
    • Topology cannot be changed or redesigned
      • May impact students’ learning abilities
    • No visualisation tools
      • Cannot see packet movement
      • Not really a problem, because visualisation does not necessarily help with learning
      • The literature is contradictory on this issue
    • Limited equipment – not solved by NetLab!
Evaluation of existing tools (CONT)
• Virtualisation of Cisco IOS devices:
  • Only two tools can do this (to our knowledge):
  1. GNS3 (open source)
    • But there are problems:
      • It uses full-blown emulation, so it performs poorly
      • It requires an image of the Cisco operating system
        • Not easy to obtain legally
        • Licensing issues may prevent the IOS image from being used
      • According to the literature, no support for Cisco switches
  2. Cisco VIRL
    • The focus of this research
    • Virtualises Cisco routers and switches
    • Images are licensed to run on VIRL
Evaluation of existing tools (CONT)
• The following table shows key differences between the tools:

Tool                          | Full Commands | Visualisation | Design Topology | Cisco IOS Features
Cisco Packet Tracer           | No            | Yes           | Yes             | Common
GNS3                          | Yes           | Topology only | Yes             | Routing
Common Open Research Emulator | No            | Topology only | Yes             | None
Real equipment                | Yes           | No            | No              | All
NetLab                        | Yes           | Topology only | No              | Most
Evaluation of existing tools (CONT)
• Effectively no-cost tools:
  • Cisco Packet Tracer
    • Provided freely to enrolled Cisco students
  • GNS3 (open source)
    • But the IOS image has to be acquired lawfully, at a cost
    • It may not be lawful to use the IOS image with GNS3
  • iNetSim
    • Developed by Curtin University (WA)
    • Designed for students with vision problems
    • Download at <http://www.cucat.org/general_accessibility/inetsim/iNetSim/iNetSim%20-%20An%20accessible%20network%20simulator.html>
    • Runs on the Mac platform only
Evaluation of existing tools (CONT)
• Tools or platforms that cost money:
  • Boson NetSim
    • Third-party CCNA tool
    • Costs money
  • NetLab
    • NetLab is not cheap
    • Must be purchased in addition to real equipment
  • Real equipment
    • Discounted equipment is available from Cisco
    • Old switches off eBay will not suffice – the IOS has to be modern to demonstrate up-to-date education principles
    • An IOS image must also be purchased if planning to use GNS3
Evaluation of existing tools (CONT)
• Cisco VIRL software interface (screenshot)

Evaluation of existing tools (CONT)
• Cisco Packet Tracer software interface (screenshot)

Evaluation of existing tools (CONT)
• Some (debugging) commands fail on Packet Tracer (screenshot)

Evaluation of existing tools (CONT)
• The same debugging commands work on Cisco VIRL (screenshot)
Research Methodology
• To measure the learning of students when using
Cisco VIRL
• Challenges
• Learning effects – students may comment positively on the tool simply because it is new
• Students already knew the concepts being evaluated.
• Assessing knowledge or efficiency at using the tool does not imply
active learning
• Approach – Use triangulation to evaluate the effectiveness
of Cisco VIRL in an academic environment.
Research Methodology (CONT)
• Triangulation
  • Use a combination of different approaches: Perceptions, Knowledge and Observation
Research Methodology (CONT)
• Triangulation
• Participants will sit in F2-55 and interact with Cisco VIRL to
undertake a network practical activity.
• Measure Knowledge and Understanding
• Use short answer questions to get participants to apply knowledge
• Issue a pre-test before using the tool to establish a benchmark
• Issue a post-test after using the tool to measure improvement
• Compare the post-test results with the pre-test results
• Requires approval from the UniSA Ethics Committee
• This has been approved
Research Methodology (CONT)
• Experimental Design
1. Pre-Lab Questionnaire
• Participants complete questionnaire before using VIRL
• Determines understanding prior to using VIRL
• Short Answer networking questions
• Determines their perceived understanding
• For example, “How well do you think you understand X?”
2. Practical Exercise
• Participants undertake a series of tasks using Cisco VIRL
• Teaches them networking concepts using VIRL
• Participants troubleshoot networking problems using VIRL
• Tests their ability to apply learned knowledge.
Research Methodology (CONT)
• Experimental Design (CONT)
3. Post-Lab
• Participants repeat the same test given in the pre-lab questionnaire
• Results are compared with the pre-test results
• If the post-test score is significantly higher than the pre-test score, this indicates positive learning (a worked comparison is sketched after this list)
• This alone is not sufficient to draw conclusions about learning
• Therefore, it is also necessary to evaluate participants’ performance during the practical
• Based on qualitative observation
• It is also necessary to evaluate their perceptions
• How do perceptions line up with actual learning?
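• As an illustration only (not the thesis’ actual analysis), the pre/post comparison could be computed along the following lines – a minimal Python sketch assuming hypothetical scores out of 5 and a paired t-test as one possible significance test:

# Illustrative sketch: hypothetical pre/post scores (out of 5), not data from the study.
from scipy import stats

pre_scores  = [2, 1, 3, 2, 2, 1, 3, 2]   # hypothetical pre-lab scores
post_scores = [4, 3, 4, 3, 5, 2, 4, 4]   # hypothetical post-lab scores

# Mean improvement per participant
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)

# Paired t-test: are post-test scores significantly higher than pre-test scores?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean gain: {mean_gain:.2f} points out of 5")
print(f"t = {t_stat:.2f}, two-sided p = {p_value:.4f}")
if t_stat > 0 and p_value / 2 < 0.05:    # one-sided check for improvement
    print("Post-test scores are significantly higher than pre-test scores.")
else:
    print("No statistically significant improvement detected.")

• A numerical comparison of this kind would still be read alongside the observation and perception data, per the triangulation approach described above.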
Research Methodology (CONT)
• Knowledge will be measured numerically
• Based on level of understanding shown during short answer
questions
• Deeper understanding results in a higher score
• Score out of 5
• Quantitative
• Observation
• Video capture of desktop session
• Used to observe their approach to troubleshooting
• Qualitative
• Perceptions
• Simple questions to gather each participant’s opinion of how they perceive VIRL
• According to ‘Effects of interactivity on students' intention to use simulation-based learning tool in computer networking education’ (2012) [6], negative perceptions can impact education
Results
• Research results were being finalised at the time of this presentation
• If you wish to know the results, please refer to the thesis
Acknowledgements
• Dr. Grant Wigley
• PhD in Information Technology
• Research Supervisor
• Dr. Ross Smith
• PhD in Information Technology
• Provided advice on questionnaire design
• Dr. Nina Evans
• Provided advice on research methods and evaluation
• Mr. Michael Head
• Teaching and Learning Advisor
• Assistance with thesis structure
Acknowledgements (CONT)
• Mrs Tracy Britz
• Research Librarian
• Provided assistance with researching literature
• Robert J. Mislevy
• University of Maryland (USA)
• Co-author of a paper I needed access to
• I emailed him directly and he gladly sent me a final draft of the required paper
• Activity Theory and Assessment Theory in the Design and
Understanding of the Packet Tracer Ecosystem (2009)
• Not accessible through Library resources
Acknowledgements (CONT)
• Dr. Stewart Von Itzstein
• Course Coordinator
• Provided assistance with administrative issues
• Others
• Anyone else who helped me but I forgot to mention
• Participants of the experiment
• Miscellaneous UniSA staff
• Family
References
• [1] J. Holvikivi, "From Theory to Practice: Adapting the Engineering Approach," in Conference on Engineering Education 2012, 2012, p. 78.
• [2] M. S. Elias and A. Z. M. Ali, "Survey on the Challenges Faced by the Lecturers in Using Packet Tracer Simulation in Computer Networking Course," Procedia - Social and Behavioral Sciences, vol. 131, pp. 11-15, 2014.
• [3] W. Makasiranondh, S. P. Maj, and D. Veal, "Pedagogical evaluation of simulation tools usage in network technology education," Engineering and Technology, vol. 8, pp. 321-326, 2010.
• [4] D. C. Sicker, T. Lookabaugh, J. Santos, and F. Barnes, "Assessing the Effectiveness of Remote Networking Laboratories," in Frontiers in Education, 2005. FIE '05. Proceedings 35th Annual Conference, 2005, pp. S3F-S3F.
References (CONT)
• [5] C. Goldstein, S. Leisten, K. Stark, and A. Tickle, "Using a network simulation tool to engage students in active learning enhances their understanding of complex data communications concepts," in Proceedings of the 7th Australasian Conference on Computing Education - Volume 42, Newcastle, New South Wales, Australia, 2005.
• [6] L. Hsin-Ke and L. Peng-Chun, "Effects of interactivity on students' intention to use simulation-based learning tool in computer networking education," in Advanced Communication Technology (ICACT), 2012 14th International Conference on, 2012, pp. 573-576.
Thank you
• Thank you for being here for this presentation
• Any questions?