UNIVERSITY OF MALTA
THE MATRICULATION CERTIFICATE EXAMINATION
INTERMEDIATE LEVEL
COMPUTING
May 2011
EXAMINERS’ REPORT

MATRICULATION AND SECONDARY EDUCATION CERTIFICATE EXAMINATIONS BOARD

IM Computing May 2011 Session Examiners’ Report

Part 1: Statistical Information

A total of 127 students applied for the May 2011 Intermediate Computing examination session. Eight of these were private candidates. Two candidates did not present their coursework exercise, while one candidate was absent for the written paper.

The written component carries 80% of the global examination mark, while the remaining 20% is carried by the coursework exercise. For this session, the mean mark for the written paper was 49.3 while that of the coursework was 15.1. The mean global mark for the examination was therefore 64.4, a decrease of 1.5 marks when compared to the mean of the previous year.

Chart 1 and Table 1 below show the distribution of the global marks (written paper plus coursework) scored by the candidates.

Chart 1: Frequency of marks by class interval (bar chart of the data in Table 1).

Class interval    Frequency
0 – 9             0
10 – 19           1
20 – 29           0
30 – 39           11
40 – 49           20
50 – 59           23
60 – 69           19
70 – 79           32
80 – 89           18
90 – 100          3

Table 1

Table 2 below shows the grades obtained by the candidates and the percentage of each grade.

Grade      Number of candidates    Percentage of candidates
A          8                       6.3%
B          23                      18.1%
C          37                      29.1%
D          27                      21.3%
E          19                      15.0%
F          13                      10.2%
Absent*    0                       0.0%
Total      127                     100%

Table 2

* Candidates who did not present their coursework AND did not turn up for the written paper.

The Coursework Component

During the moderation of the project exercise, the moderators visited all the schools/colleges that prepared students for this examination. An overview of the moderators’ comments follows:

In most cases the marks awarded by the tutors for the projects were acceptable. However, in one particular school/college the moderator reduced the marks by a couple of points across the board.

The moderators noted that most schools/colleges conduct an interview with each individual candidate to confirm the authenticity of the work presented. This is to be commended.

In a handful of cases the tutors noted that the project showed clear signs of plagiarism, and they rightly penalised the candidates accordingly.

Item Analysis of the Written Component

Table 3 below shows the maximum mark that could be scored for each of the 12 items in the written paper, the mean mark scored and the standard deviation for each item. The table also shows the Facility Index for each item – the index may range from 0, for an item in which candidates obtained 0 marks, to 1.0 for an item in which all candidates scored full marks.

Item      Maximum    Standard     Mean    Facility    Choice
Number    Mark       Deviation            Index       Index
A1        6          1.3          4.6     0.8         -
A2        6          2.2          3.9     0.7         -
A3        6          2.3          3.8     0.6         -
A4        6          1.3          3.9     0.7         -
A5        6          1.8          3.0     0.5         -
A6        6          1.7          3.3     0.6         -
A7        6          1.5          4.3     0.7         -
A8        6          1.7          3.6     0.6         -
A9        6          1.6          3.4     0.6         -
A10       6          1.3          3.0     0.5         -
B1        20         3.9          14.8    0.7         0.3
B2        20         3.5          12.3    0.6         0.7

Table 3

The Choice Index given in the table above is a measure of the popularity of an item – an index of 0 indicates that an item was not chosen by any candidate, while an index of 1.0 shows that an item was selected by all candidates. The Choice Index only applies to the two items in Section B because the items in Section A are compulsory.
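The report does not state how these indices are calculated. The values in Table 3 are, however, consistent with the facility index being the mean mark expressed as a fraction of the maximum mark, and with the choice index being the fraction of candidates who attempted an optional item. The short Python sketch below illustrates this reading; the formulas are assumptions rather than definitions quoted from the report, and small differences in the final decimal place can arise because the published means are themselves rounded.

```python
# Illustrative sketch only: the facility and choice index formulas below are
# assumed (mean/maximum and attempts/candidates); they are not quoted from the
# report. Item data are taken from Table 3.

table_3 = {
    # item: (maximum mark, mean mark)
    "A1": (6, 4.6), "A2": (6, 3.9), "A3": (6, 3.8), "A4": (6, 3.9),
    "A5": (6, 3.0), "A6": (6, 3.3), "A7": (6, 4.3), "A8": (6, 3.6),
    "A9": (6, 3.4), "A10": (6, 3.0), "B1": (20, 14.8), "B2": (20, 12.3),
}

def facility_index(mean_mark: float, max_mark: int) -> float:
    """Mean mark as a fraction of the maximum (0 = no marks, 1 = full marks)."""
    return mean_mark / max_mark

def choice_index(attempts: int, candidates: int) -> float:
    """Fraction of candidates who selected an optional (Section B) item.
    The report does not publish attempt counts, so this is not applied above."""
    return attempts / candidates

for item, (max_mark, mean_mark) in table_3.items():
    print(f"{item:>3}: facility index = {facility_index(mean_mark, max_mark):.2f}")
```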
Chart 2 below shows the Facility Indices in graphical format.

Chart 2: Facility index by question number, for items A1 to A10, B1 and B2 (bar chart of the data in Table 3).

Table 4 below shows the items in decreasing order of facility, together with the topic that each question tested.

Item      Facility    Topic tested
Number    Index
A1        0.8         Logic circuits and the function of a half adder
A2        0.7         Simplification of Boolean expressions
A4        0.7         CPU architecture
A7        0.7         Computer applications
B1        0.7         System design
A3        0.6         Operating systems
A6        0.6         Software and checking
A8        0.6         Number systems and the Data Protection Act
A9        0.6         Networks
B2        0.6         Networking and communication
A5        0.5         Sorting algorithms
A10       0.5         Database concepts

Table 4

Markers’ Comments on the Written Component

The markers’ comments on individual items are reproduced below:

A1
The majority of students answered this question correctly, with typical answers giving a complete and correct truth table. Very few students got a couple of rows wrong, while even fewer did not understand it at all and gave a completely wrong answer. The same applies to the second part of this question, for which the answer should have been ‘half adder’ (a sketch of the half adder is given after these comments). Around 80% of students got it right, around 10% said ‘full adder’, while the remaining 10% had no idea.

A2
Nearly all the students (85%) answered this give-away question correctly. Only a very few (10%) did not reach the correct answer of A+B, but eventually drew the corresponding circuit and earned marks just the same. The remaining 5% of students had no idea of what needed to be done.

A3
The majority of the students (80%) answered this question correctly. The rest either gave examples such as Windows, Mac, etc. under a misconception, or did not know the answer at all.

A4
This question about the CPU was very well answered by almost all students; a very few left out minor details and lost a few marks in the process.

A5
Even though the majority of students know what sorting is all about, some 25% failed to give a practical example of where a sort is applied in reality. When students had to explain how insertion and bubble sort work, half the students knew exactly that insertion sort introduces one item at a time, placing it in its correct position, while bubble sort compares neighbouring items and swaps them if necessary (an illustrative sketch of both algorithms is given after these comments). The other half failed to give even a simple explanation.

A6
Most candidates answered part (a) correctly, with only a few who could not identify the two major categories of software. As regards part (b), quite a few did not provide good test data to validate the entry of an examination mark.

A7
Most candidates answered this question well. Some did not know what a VLE is or could not remember an example of a VLE.

A8
Many candidates did not know the range in decimal for an 8-bit register, and although nearly all knew how to convert a positive decimal number to binary, quite a few did not know how to convert a negative number (a conversion sketch is given after these comments). Many candidates could not differentiate between the Data Protection Commissioner and the data controller.

A9
A few candidates could not identify the difference between the Internet and the WWW. As regards part (c), many could not identify the real function of a server and a proxy server.

A10
Most candidates could not list four distinct advantages, or gave the same advantage using different wording. As regards part (b), some candidates confused a flat-file database with a hardcopy database.
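For illustration, the half adder referred to in A1 produces a sum bit and a carry bit from two input bits. The sketch below simply prints its truth table; it is an illustrative aid, not the circuit diagram that appeared in the paper.

```python
# Half adder (see comment A1): sum = A XOR B, carry = A AND B.
# Prints the truth table candidates were expected to complete.
print("A B | Sum Carry")
for a in (0, 1):
    for b in (0, 1):
        s = a ^ b      # sum bit: exclusive OR of the two inputs
        c = a & b      # carry bit: AND of the two inputs
        print(f"{a} {b} |  {s}    {c}")
```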
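The distinction drawn in A5 between the two sorting methods can be made concrete with a short sketch. The Python code below is illustrative only and is not taken from any candidate’s answer or from the marking scheme.

```python
def insertion_sort(values):
    """Insertion sort: take one item at a time and slot it into its
    correct place among the items already sorted (as described in A5)."""
    result = list(values)
    for i in range(1, len(result)):
        item = result[i]
        j = i - 1
        while j >= 0 and result[j] > item:
            result[j + 1] = result[j]   # shift larger items one place right
            j -= 1
        result[j + 1] = item            # drop the item into its correct place
    return result


def bubble_sort(values):
    """Bubble sort: repeatedly compare neighbouring items and swap them
    if they are out of order (as described in A5)."""
    result = list(values)
    for end in range(len(result) - 1, 0, -1):
        for j in range(end):
            if result[j] > result[j + 1]:
                result[j], result[j + 1] = result[j + 1], result[j]
    return result


marks = [49, 64, 15, 80, 20]
print(insertion_sort(marks))   # [15, 20, 49, 64, 80]
print(bubble_sort(marks))      # [15, 20, 49, 64, 80]
```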
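The conversion that caused difficulty in A8 is sketched below. The sketch assumes the paper expected two’s complement representation, in which an 8-bit register holds values from -128 to +127; the example values are hypothetical.

```python
def to_twos_complement(value: int, bits: int = 8) -> str:
    """Return the two's complement bit pattern of a signed integer.
    An 8-bit register can hold values from -128 to +127."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    if not lo <= value <= hi:
        raise ValueError(f"{value} does not fit in {bits} bits")
    return format(value & ((1 << bits) - 1), f"0{bits}b")


print(to_twos_complement(75))    # 01001011  (positive values: plain binary)
print(to_twos_complement(-75))   # 10110101  (invert the bits of 75 and add 1)
```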
B1
Less than half of the candidates selected this question and only a few answered it perfectly. Not all candidates knew that a well-established methodology is one that has been proven repeatedly and tested, and that it offers sequential and incremental development. Moreover, the top-down and bottom-up approaches were correctly described by only half of the candidates who attempted the question, with the other half giving a wrong reply. The rest of the question was well answered, with the majority of candidates knowledgeable about modular design as well as the other steps within the life-cycle; still, around 10% failed to give a correct answer.

B2
As regards part (a), most candidates provided adequate descriptions together with examples of LAN, MAN and WAN. Very few answered the questions about network topology incorrectly. While many knew what the OSI model is and knew the names of its seven layers, very few could describe each layer correctly. None of the candidates could identify the data units for each layer, and most candidates left this part unanswered.

Chairperson
Board of Examiners
July 2011