M.Sc. Theses in Computer Science

1. USMAN, GAMBO ABDULLAHI (PGS/00295)
MODELLING AND SIMULATION STUDIES OF ARTIFICIAL PANCREAS, 1994
2. HAROUNA, NAROUA (PGS/00321)
DEVELOPMENT OF A SOFTWARE TOOL FOR THE EVALUATION OF STRUCTURALISM IN dBASE PROGRAMS, 1994
The application of mathematical modelling, dynamic systems analysis and simulation is playing an increasingly important role in the study of biological systems in general, and of metabolic and endocrine processes in particular, both in physiology and in clinical medicine. The proper use of such techniques usually provides the physician and the medical or biological research worker in the life sciences with a greater understanding of the nature and behaviour of the complex processes that occur in physiology. Equally, mathematical models are now increasingly being used as aids in the diagnosis and treatment of diseases. This study essentially looked at Diabetes Mellitus by studying models of the artificial pancreas. Real-world data about diabetic patients were
collected from some hospitals and processed. A
simulation experiment using the GPSS Simulation
Language was designed and implemented. The results
obtained from the Computer Simulation of two
alternative control algorithms for the artificial pancreas,
namely, the Albisser and the Biostator-Miles algorithms,
indicated that the Biostator algorithm performs better
than the Albisser algorithm in normalising the diabetic
patient. It was also discovered that the route of insulin
administration plays an important role in the
normalisation of the metabolic state of a diabetic patient.
This is indicated by the performance of the portal route
as compared to the peripheral route.
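For illustration only, the following minimal Python sketch shows the general shape of a closed-loop glucose-insulin simulation with a simple proportional infusion rule. It is not the Albisser or Biostator-Miles algorithm, and every parameter value is hypothetical rather than taken from the thesis.

# Minimal, hypothetical closed-loop insulin-infusion sketch (illustrative only).
def simulate(minutes=600, dt=1.0):
    G, I = 180.0, 10.0                      # plasma glucose (mg/dL) and insulin (mU/L), assumed start
    p1, si, gb = 0.02, 0.0005, 90.0         # hypothetical decay rate, insulin sensitivity, basal glucose
    k_i = 0.1                               # assumed insulin clearance rate
    history = []
    for step in range(int(minutes / dt)):
        # proportional controller: infuse more insulin the further glucose is above target
        infusion = max(0.0, 0.05 * (G - 100.0))     # mU/min; hypothetical gain and target
        dG = (-p1 * (G - gb) - si * I * G) * dt
        dI = (infusion - k_i * I) * dt
        G, I = G + dG, I + dI
        history.append((step * dt, G, I))
    return history

if __name__ == "__main__":
    final_t, final_g, final_i = simulate()[-1]
    print(f"glucose after {final_t:.0f} min: {final_g:.1f} mg/dL")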
The art of programming seems to be full of elusive goals. There is no doubt that software can be produced in an efficient, timely fashion; it is known to all of us and we have even tried it. But we have always asked ourselves the following questions: Is our program going to fail? When and how is our program going to work? Serious work has been devoted to that goal. Brilliant people have
3. MUKTAR, LIMAN AHMED (PGS/00296)
STUDIES ON FILE ORGANISATION TECHNIQUES: A CRITICAL AND COMPARATIVE ANALYSIS, 1994
toiled for years to find the answers to those questions and
they have produced masterpieces. For that same purpose,
a complete static analysis was carried out on the
Abubakar Tafawa Balewa University Computer Science
final-year students' projects. The software engineering
environment was restricted to programs written in dBase.
To prove a program correct, it was tested for
structuralism using the static analysis of program
segments. Both theory and algorithms were developed on
structuralism. Finally, a package was developed to assist
the analysis by providing statistics on the program
structure based on program reduction. It was observed
that the students made good use of SESE (SINGLE
ENTRY, SINGLE EXIT) constructs. It was also
observed that the students made use of dBase control and
data structures. The major source of students' problems was found to be the use of the LOOP statement.
Five file organisation techniques were studied, analysed
and compared. These techniques include Sequential, Indexed-sequential, Direct, Multi-indexed, and Multiring files. Each of these methods was studied and analysed based on the data transfer rates and average seek times of some direct access devices. Some hardware parameters were derived and, in turn, performance formulas were written for each technique. A simulation run was carried out for each method. The
performance was measured in terms of record size, time
to fetch a record, get-next record, insert a record, update
a record, read entire file and file reorganization.
Simulation results show that when a fixed record size is used in an organisation, sequential and multiring files require the minimum number of bytes to store a record and are hence, in that respect, the best of all the methods. While Indexed-sequential is the fastest in fetching a record as well as in getting the next record, the sequential file is the fastest in insertion and updating of records as well as in file reorganization. The
4. LAWAN, ABDULWAHAB
APPLICATION OF COMPUTER SIMULATION IN MULTIPLE SERVER FINITE QUEUEING SYSTEM, 1996

5. AKINYEMI, ADESINA ALABA (PGS/00688)
ESTIMATING SOFTWARE COST IN A DEVELOPING COUNTRY: NIGERIA AS A CASE STUDY, 1997
direct method placed second fastest in fetching and updating of a record as well as in file reorganization. Indexed file organisation performs fairly well in fetch, get-next and insertion of a record. From the analysis it was found that sequential file organization gave the overall best performance, followed by indexed-sequential. The direct method came third while the fourth position went to indexed file organization. The multiring
method took the last position.
This project presents a computer simulation approach,
which evaluates the performance criteria of a multiple
server finite queueing system for different numbers of servers, with a view to determining, through comparison of the results, which number of servers gives the optimum of the performance criteria. Texaco filling station, Jos road, Bauchi, was used as a case study. The arrival rate follows a Poisson process, and the service times are exponentially distributed. The performance criteria are
measured in terms of average number of vehicles in
queue and in system per minute, average waiting time in
queue and in system per vehicle, average utilisation of
service facility, average number of vehicles serviced per
day and average contribution made as a profit per service
point. The starting data required for the simulation
experiment is obtained from the filling station. The data
collected provided necessary inputs for the simulation
model. The simulation result showed that the number of service points that produced the optimum of the specified performance criteria of the system was three. This holds despite the fact that the chance of the three service points being idle increased; the increase was insignificant when compared with the benefits derived from the system.
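As a cross-check on such a simulation, the standard analytic M/M/c (Erlang C) formulas give the same performance measures directly. The sketch below uses hypothetical arrival and service rates, not the filling-station data.

import math

def mmc_metrics(lam, mu, c):
    """Standard M/M/c queue formulas. lam = arrival rate, mu = service rate
    per server, c = number of servers; the example values are hypothetical."""
    rho = lam / (c * mu)
    assert rho < 1, "system must be stable"
    a = lam / mu
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    lq = p0 * a**c * rho / (math.factorial(c) * (1 - rho)**2)   # mean queue length
    wq = lq / lam                                               # mean wait in queue
    return {"utilisation": rho, "Lq": lq, "Wq": wq, "W": wq + 1/mu, "L": lq + a}

# hypothetical rates: 0.8 vehicles/min arriving, 0.5 vehicles/min per pump
for servers in (2, 3, 4):
    print(servers, mmc_metrics(0.8, 0.5, servers))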
Estimating the cost of a software product is one of the
most difficult and error-prone tasks in software
engineering. The dynamic nature of programming and
6. YUSUF, BASHIR KAGARA (PGS/00974)
SIMULATION OF KERNEL MULTI-TASKING OPERATING SYSTEM IN UNIX ENVIRONMENT, 1997
the need for an objective pricing system for software products are prerequisites for the assessment of different methods of software cost estimation. This study looked at
different methods of estimating cost of software with a
view to arriving at an acceptable and widely used
method. An experimental program using Turbo PASCAL
language was designed and implemented for constructive
cost model (COCOMO) and Putnam cost estimation
model. The performance was measured in terms of man-hours, programmer experience, duration, staff size, amount of source code involved, reliability and lines of documentation. The results obtained from the program for software cost estimation using the constructive cost model (COCOMO) versus the Putnam cost estimation model show that it is cheaper using COCOMO than
using Putnam cost estimation model method. It was
observed that software developers may prefer COCOMO
method to Putnam cost estimation method and other
methods, but may unconsciously make use of expert
judgement and Delphi cost estimation methods
interchangeably. It was also observed that in a
developing country the efforts of Analysts and
programmers are not always well rewarded and
recognized, since, their products are assumed to be
inferior to the imported packages which is not always
true.
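For reference, the basic COCOMO effort and schedule equations (Boehm's published basic-model coefficients) can be written in a few lines; the 25 KLOC example is hypothetical and is not the thesis's Turbo Pascal program.

def basic_cocomo(kloc, mode="organic"):
    """Basic COCOMO effort and schedule (Boehm). Coefficients are the
    published basic-model values; the example size is hypothetical."""
    coeff = {"organic":      (2.4, 1.05, 2.5, 0.38),
             "semidetached": (3.0, 1.12, 2.5, 0.35),
             "embedded":     (3.6, 1.20, 2.5, 0.32)}
    a, b, c, d = coeff[mode]
    effort = a * kloc ** b            # person-months
    duration = c * effort ** d        # months
    staff = effort / duration         # average staff size
    return effort, duration, staff

print(basic_cocomo(25.0))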
A general way to make a smaller model of a large
system, or to represent the fact that the observations
possible on it are limited, is to apply an abstraction to it.
Simulation techniques have a far-reaching impact on digital computer systems in general, and kernel multi-tasking operating system implementation is one area that deserves critical study for efficient management of computer system resources. The study was an attempt to model the actual kernel of the UNIX operating system and to evaluate various design options and trade-offs as they
7. BOUKARI, SOULEY (PGS/00970)
PERFORMANCE EVALUATION OF DIGITAL COMPUTER SYSTEM, 1998
were encountered. A simulation model, using C++
programming language, was designed and implemented.
The major functions of the kernel studied include
process management, interprocess communication and
synchronization, interrupt management and startup of the
initial system configuration. The model was implemented
on a GATEWAY 2000 486 microcomputer running the SCO UNIX Release 3.2 Version 4.2 operating system.
Numerical experiments on random instances of the
performance metrics suggest that the model performs
satisfactorily on the average. A classical concurrency
problem, the dining philosophers problem, was solved
using the model.
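The dining philosophers problem mentioned above is easily illustrated with threads; the short sketch below (Python threads rather than the thesis's C++ model) uses the classic resource-ordering rule to avoid deadlock.

import threading

N = 5
forks = [threading.Lock() for _ in range(N)]

def philosopher(i, meals=3):
    # Acquire the lower-numbered fork first; this total ordering of resources
    # prevents the circular wait that causes deadlock.
    first, second = sorted((i, (i + 1) % N))
    for _ in range(meals):
        with forks[first]:
            with forks[second]:
                pass  # "eating"

threads = [threading.Thread(target=philosopher, args=(i,)) for i in range(N)]
for t in threads: t.start()
for t in threads: t.join()
print("all philosophers finished without deadlock")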
With the rapid involvement of industries and many
business oriented organisations in computing, a tool for
evaluating, analyzing and planning computing systems
becomes imperative. For most computer systems,
response time, resource utilization and throughput are the
primary parameters of performance. In order to evaluate
and predict the above parameters and other aspects of
performance of a digital computer system, it becomes
necessary for us to represent the computer system and its workload by some models. This work presents some tools for evaluating the performance of a digital computer system with a view to determining, through comparison of the results obtained, which approach gives the optimum performance. NITEL, North-East Zone, and the A.T.B.U. Computer Centre, Bauchi, Nigeria, were considered as case studies. The starting data required for the simulation were obtained from the NITEL MIS Data Processing Unit and the A.T.B.U.
Computing Unit. The data collected were used as input
for the simulation model. The results show that the mean
response time and the mean throughput of the system
obtained from the basic simulation model are more
realistic than the ones obtained with the queuing model.
8. ALI, JOAN BERI (PGS/00833)
SOFTWARE RELIABILITY MEASURES USING SOME RELIABILITY MODELS, 1998
9. TITIKUS, ISHAYA AUDU (PGS/96/01131)
A SOFTWARE FOR COPING WITH UNCERTAINTIES WITH APPLICATION TO YOLA-ADAMAWA STATE AGRO-CLIMATOLOGY, 2000
The simulation model is therefore recommended as a
powerful tool for evaluating and predicting the
performance measures of digital computer systems.
This work uses reliability models to estimate the
reliability of the ATBU payroll software. The models used are the Basic-Reliability, Jelinski-Moranda, Schick-Wolverton and the Path-decomposition models. The results show that the Jelinski-Moranda and Path-decomposition models, with reliabilities, R(t), of 8.6 x 10^-1 and 7.03 x 10^-1 respectively, performed better than the others. This corresponds to a Mean Time To Failure, MTTF, of 13.7 minutes and 12.99 minutes respectively. It was concluded that the Jelinski-Moranda and Path-decomposition models could be used in the calculation of reliability for software that has similar characteristics to the ATBU payroll.
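For orientation, the Jelinski-Moranda model's reliability and MTTF after a given number of fault removals follow directly from its assumed constant per-fault failure rate; the parameter values in this sketch are hypothetical, not those fitted to the ATBU payroll data.

import math

def jelinski_moranda(N, phi, i, t):
    """Jelinski-Moranda: after (i-1) faults are removed the failure rate is
    phi*(N - i + 1), so R(t) = exp(-phi*(N - i + 1)*t) and
    MTTF = 1 / (phi*(N - i + 1)). N and phi here are hypothetical."""
    rate = phi * (N - i + 1)
    return math.exp(-rate * t), 1.0 / rate

reliability, mttf = jelinski_moranda(N=30, phi=0.003, i=5, t=10.0)
print(f"R(10) = {reliability:.3f}, MTTF = {mttf:.1f} minutes")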
One of the essences of knowledge is enabling human beings to cope with the odds of their environment, particularly if such odds cannot be got rid of. One such environmental odd is the weather, over which man so far has no control. The study presents how computer capability could be harnessed to assist man in taking decisions that will enable him to cope with the odds of his environment. In this regard, Yola-Adamawa State, as an example of areas where farmers battle with agro-climatological uncertainties, was studied. Data on rainfall, price lists of crops, farm inputs and crop yields per hectare under various climatic conditions were collected and analyzed to produce a crop decision matrix. The
following uncertainty decision criteria were studied,
presented and incorporated into the software: Maximax,
Maximin, Minimax, Laplace, Hurwicz, Expected
Monetary Value (EMV), Expected Opportunity Loss
(EOL) and Expected Utility Value (EUV). Though the most widely used uncertainty decision criterion is the EMV, it has a major shortcoming: it fails to
10. SULEIMAN, ABDULMALIK (PGS/01249)
A STUDY OF SOME ROUTING ALGORITHMS, 2000

11. KWAMI, ABUBAKAR MOHAMMED (PGS/0348)
APPLICATIONS OF FOURIER TRANSFORM TO THE GENERALISED D'ALEMBERT WAVE EQUATION, 2000
address the decision-maker's personal attitude to risk.
This inadequacy hitherto made this decision criterion less
desirable. An Index of Variation (IV), which measures the risk potential of each alternative, has been introduced in this project as a means of addressing this inadequacy. This software, which is of general application to uncertainty decision matrices, was then applied to the crop decision matrix of the study area, and the system displayed the economic expectation and the risk status of each crop. Based on the economic expectation of the crops, which is measured in terms of Expected Monetary Value (EMV), and their climatic risk status, which is measured in terms of the Index of Variation (IV), the system identified Groundnuts and Guinea Corn, respectively, as the most viable crops to be grown in Yola-Adamawa State.
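The classical criteria named above operate on a payoff matrix and can be illustrated compactly. The sketch below uses a hypothetical crop payoff matrix, and its "risk" column is an ordinary spread measure used as a stand-in, not the thesis's Index of Variation, whose definition is not reproduced here.

from statistics import mean, pstdev

# Hypothetical payoff matrix: rows = crops, columns = climate states.
payoffs = {"Groundnut": [120, 80, 40], "Guinea corn": [100, 90, 60], "Maize": [150, 60, 10]}
probs = [0.3, 0.5, 0.2]    # assumed probabilities of the climate states

def criteria(row):
    emv = sum(p * x for p, x in zip(probs, row))
    return {"maximax": max(row),      # best of the best outcomes
            "maximin": min(row),      # best of the worst outcomes
            "laplace": mean(row),     # equal-likelihood average
            "EMV": emv,
            "risk": pstdev(row)}      # stand-in risk measure, NOT the thesis's IV

for crop, row in payoffs.items():
    print(crop, criteria(row))
print("highest EMV:", max(payoffs, key=lambda c: sum(p * x for p, x in zip(probs, payoffs[c]))))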
In this project report, a survey was conducted on some
routing algorithms. Based on this survey, an algorithm for channel routing using a two-layer channel, which finds an optimal solution in VLSI design, was proposed.
Associated with the channel routing problem are two
constraints: Horizontal and Vertical. Using the horizontal
and vertical constraints graphs, a searching tree was
formed. Using good dominance rules to terminate unnecessary nodes in the searching tree and a proper cost function, the algorithm was terminated and an optimal solution obtained as soon as a leaf node was reached. Experimental results show that the algorithm behaves quite well in average cases, which coincides with theoretical analysis.
Partial differential equations occur in some areas of science and engineering, and their application to real-life situations is of great importance; therefore it is necessary that some method be employed in solving them. Although various methods such as separation of variables, the method of characteristics, and the Laplace
12. DAPIAP, STEPHEN BOERWHOEN (PGS/97/01268)
STUDY OF ALGORITHMS FOR DEADLOCK DETECTION, 2000
transform method have been proposed and proved to be successful for certain classes of equations, their complexity and ambiguity make it necessary for effort to be devoted to other methods of solution. As a result, much attention has been given to the use of the Fourier transform method, which is often simpler and widely accepted. The subject of this project report is the Fourier transform method applied to the generalised d'Alembert wave equation. This equation was solved by Ducati and Capelas de Oliveira (1998) using Riemann's method. In this work, the solutions of special cases are obtained using the Fourier transform method. The result obtained using the Fourier transform method is similar to that obtained using Riemann's method, and is shorter and simpler compared to Riemann's method.
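As a reminder of the method, the Fourier-transform treatment of the classical one-dimensional wave equation (the generalised equation studied in the thesis is not reproduced here) runs as follows. For $u_{tt} = c^2 u_{xx}$ with $u(x,0)=f(x)$ and $u_t(x,0)=g(x)$, transforming in $x$ gives an ordinary differential equation in $t$:
\[
\hat{u}_{tt}(k,t) = -c^2 k^2 \,\hat{u}(k,t), \qquad
\hat{u}(k,t) = \hat{f}(k)\cos(ckt) + \frac{\hat{g}(k)}{ck}\sin(ckt),
\]
and inverting the transform recovers d'Alembert's solution
\[
u(x,t) = \tfrac{1}{2}\bigl[f(x-ct)+f(x+ct)\bigr] + \frac{1}{2c}\int_{x-ct}^{x+ct} g(s)\,ds .
\]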
Deadlock detection, one of the three techniques for handling the deadlock problem, was selected and the algorithms to implement it were analysed and compared. The comparisons were made in order to measure their performance and determine the most efficient algorithm, in terms of time and search paths, for solving the deadlock problem by deadlock detection. Breadth-first search, Depth-first search, and the Horizontal and Vertical search were the algorithms analysed and compared. The
algorithms were compared based on representing two
resource allocation systems on directed graphs. The first
system was simple with six process nodes and five
resource nodes whereas the second system was more
complex, with twenty process nodes and seventeen
resource nodes. In each of the systems, three different situations based on resources requested by processes were considered: a request that resulted in a deadlock situation, a request for a free resource and its immediate allocation, and a request for a resource that was not free but where waiting for it was deadlock-free. The programming language
13. AGAJI, IORSHASE (PGS/01152)
STUDIES IN COMPUTER MEMORY MANAGEMENT IN A NETWORK ENVIRONMENT, 2000
14. EKANEM, BASSEY ASUQUO (PGS/9899/404028)
COPING WITH SOFTWARE MAINTENANCE COST WITH RESPECT TO WARRANTY TERMS AND MANPOWER DEVELOPMENT: A CASE STUDY OF CREATION NIGERIA LIMITED, 2002
C++ was used to write the programs for carrying out the
required operations on a computer system. Amongst the three algorithms, the Horizontal and Vertical search algorithm performed best, followed by the Depth-first search algorithm and the Breadth-first search algorithm in that order. The Horizontal and Vertical search algorithm was therefore found to be the most efficient for implementing the deadlock detection strategy.
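The core of detection by graph search is finding a cycle in the wait-for graph. The sketch below is a plain depth-first-search cycle check on a hypothetical graph; it is not the thesis's Horizontal and Vertical search algorithm.

def has_deadlock(wait_for):
    """Detect a cycle in a wait-for graph (process -> processes it waits on)
    using depth-first search. A cycle means a deadlock exists."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {node: WHITE for node in wait_for}

    def dfs(node):
        colour[node] = GREY
        for succ in wait_for.get(node, []):
            if colour.get(succ, WHITE) == GREY:      # back edge: cycle found
                return True
            if colour.get(succ, WHITE) == WHITE and dfs(succ):
                return True
        colour[node] = BLACK
        return False

    return any(colour[n] == WHITE and dfs(n) for n in wait_for)

# hypothetical systems: the first is deadlocked, the second is not
print(has_deadlock({"P1": ["P2"], "P2": ["P3"], "P3": ["P1"]}))   # True
print(has_deadlock({"P1": ["P2"], "P2": ["P3"], "P3": []}))       # False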
Memory management in a network environment was
studied. A simulation program which minimises the cost
and suggests the shortest path of page migration across the network by analysing a corresponding communication matrix was developed. This program was tested using the star network of a commercial bank with twelve nodes across Nigeria. The results of the simulation program were compared to the minimum spanning tree of the network. It was found that the cost of communication is
minimal using the simulation program.
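The kind of cheapest-path query involved in page migration over a communication-cost matrix can be illustrated with Dijkstra's algorithm. The 4-node matrix below is hypothetical, not the bank's 12-node network, and this sketch is not the thesis's program.

import heapq

INF = float("inf")

def cheapest_path(cost, src, dst):
    """Dijkstra's algorithm over a symmetric communication-cost matrix;
    cost[i][j] is the cost of moving a page between nodes i and j."""
    n = len(cost)
    dist, prev = [INF] * n, [None] * n
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v in range(n):
            if cost[u][v] < INF and d + cost[u][v] < dist[v]:
                dist[v] = d + cost[u][v]
                prev[v] = u
                heapq.heappush(heap, (dist[v], v))
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return dist[dst], path[::-1]

matrix = [[0, 4, INF, 1],
          [4, 0, 2, 5],
          [INF, 2, 0, 8],
          [1, 5, 8, 0]]
print(cheapest_path(matrix, 0, 2))   # (6, [0, 1, 2])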
This research work used Software Maintenance Cost
Models and related Models to devise a means of coping
with software maintenance cost for any given warranty
term through careful planning of the release time and
proper manpower deployment. Models used include the Warranty and Integration-Test, Wolverton Penalty-Incentive, Cost-Overrun Penalties, Mills, MTTF and Reliability models. Results of the research work conducted on Software Projects of Creation Nigeria Limited showed that the best way to cope with the
Project Management System (PMS) maintenance within
the warranty period of 12 months is to release it at the
end of the 7th week of its integration test. This will limit
field errors to 24 as against other higher values, if
released earlier. PMS reliability at the release time is 0.7.
For Loans and Project Management System (LPMS)
project, the optimum release time is at the end of the 8th
week of integration. This limits field errors to 33 against
15. AHMADU, ASABE SANDRA (PGS/98/404011)
A COMPUTER BASED HUMAN RESOURCE INFORMATION SYSTEM: A CASE STUDY OF FEDERAL UNIVERSITY OF TECHNOLOGY, YOLA, 2002
other higher values if released earlier. Extending the process beyond the respective optimum release times will definitely reduce the number of field errors, but such extension is not tenable considering the high cost of additional integration testing compared with the low cost of removing the field errors. Results also showed that the total development force will drop to zero when all the 9 projects considered have been released. This is due to the
total workforce being gobbled up in maintenance of the
released projects, a situation that requires urgent attention
if the software organization must survive.
Traditional human resources management systems rely
on the intuition and whimsies of Human Resource
Managers to take critical decisions that concern the
selection, recruitment, development and utilization of
human capital. In small, non-critical rudimentary
industries, this system serves its purpose very well.
However, in organizations where critical talents are
needed, there is the need for highly responsive Human
Resource Information System to aid today's Human
Resource Managers to identify, select, develop and retain
the talents needed to guide the operations of the
organisation in this era of Information Technology. In
this project a computer based Human Resource
Information System is developed. The software provides
a guide on how to effectively and efficiently manage
human resources in a university system such as Federal
University of Technology Yola. The software package
provides inquiries on staff list by identity number, name,
department, rank, state, Local Government Area, year
spent and also the list and the count of staff in a
particular job group (Junior or Senior), and job class
(Academic or non-Academic). It provides vital reports that help management in deciding who should be appraised, identifying staff that require training, identifying skills that need to be developed, and deploying staff from areas
16. OBEWA, OLUCHI JENNIE (PGS/01188)
COMPARISON OF ALTERNATIVE QUEUING SYSTEMS IN A BANK USING SIMULATION TECHNIQUES: A CASE STUDY OF FIRST BANK NIGERIA PLC, BAUCHI BRANCH, 2002
of excess to areas of need, thus enhancing good staff utilization. Visual Basic 6.0 provided a veritable platform for developing the front end using the object-oriented programming paradigm, while Microsoft Access was used to create and mount the back end for enterprise connection as a database server.
When limited service facilities fail to satisfy the demands for service made upon them, bottlenecks occur which generate queues or waiting lines. In business, queues have some "economic" or "cost" implications, which include the cost of allowing long queues to build up and the cost of speeding up services to reduce waiting times and queue lengths. Queuing problems are therefore concerned with minimizing average waiting times and the average length of queues, finding the number of service points that ought to be used, the cost of the servicing system and the benefits from cutting down waiting times. In this work, a
software, Bank_Queue_Simulator, has been developed using Turbo Pascal for Windows to simulate two queuing models, namely the single-queue multi-teller model and the multi-queue multi-teller model. The results of this experiment show that using the single-queue system running on five (5) tellers greatly reduces the waiting time in the system and the length of the queue, to 5.47 minutes and a customer in queue per minute, respectively. The cost of employing 5 tellers is less than that for the current 3-teller system. The probability of an arriving customer finding all tellers busy and that of finding an idle teller is fifty-fifty. For any meaningful reduction in waiting times and queue lengths to be achieved, the management is advised to consider employing 2 more tellers in addition to the existing 3 tellers to speed up service, though this is subject to what management will consider a reasonable wait time. It is further recommended that management should consider a re-training programme for its teller staff on effective and efficient service.
17. OYONG, SAMUEL BISONG (PGS/9899/404021)
TIME ESTIMATION TOOL FOR SOFTWARE DEVELOPMENT PROJECTS. CASE STUDY: UNIVERSITY OF CALABAR COMPUTER CENTRE (Software Development Unit), 2003
18. EKABUA, OBETEN OBI (PGS/9899/404031)
SOFTWARE COMPOSITION AND BUILDING COMPLEX SYSTEMS: CASE STUDY OF DECK OIL SERVICES COMPANY LIMITED, PORT HARCOURT, 2003
This research work is aimed at improving time estimation for new software projects. A tool was developed that can be used, given the set of general user requirements and technical requirements, to estimate the time required to complete a given project. The tool made
use of source lines of code which were determined in
terms of size using statistical models, while Effort and
Schedule were determined using Boehm et al (1995)
COCOMO II Models. Putnam and Myers (1992)
Software Life Cycle Management (SLIM) model was
also used to determine the core metrics. The software
development time was computed from the models
developed by the two schools of thought: COCOMO II
and SLIM. In practice, Size estimation and hence Effort
and Schedule estimation are done either from experience
(the Delphi method) or by Analogy. The researcher's
contribution to knowledge is the development of a Tool
that uses the theoretical method (algorithmic models) of determining the core metrics for new projects using software models. This tool was applied in the University of Calabar Computer Centre to estimate the time to develop a revenue collection program for Equity Bank Plc. It was found that it is good to use
models from both schools of thought in developing a
software project. No single model can be used to achieve
a satisfactory result.
One persistent problem in software engineering is how to put software systems together out of smaller sub-systems. The software systems of today are rapidly growing in number, sophistication, size, complexity, amount of distribution and number of users. Software architectures promote development focused on modular building blocks (components), their interconnections (configurations) and their interactions (connectors). The
emergence of software architecture and architectural style
has focused attention on a new set of abstractions with
19. EDIM, AZOM EMMANUEL (PGS/9899/404030)
MODEL FOR SOFTWARE COST ESTIMATION: A CASE STUDY OF MULTISOFT NIGERIA LIMITED, PORT HARCOURT, 2003

20. ETIM, EDEM EKONG (PGS/9899/404022)
COMPUTER IMPLEMENTATION OF THREE STATIONARY ITERATIVE METHODS FOR SOLVING LINEAR SYSTEMS, 2004
with which we can create and compose software systems and gain intellectual control of software development. This thesis's objective is to provide a model for the composition of different architectural styles within software systems and a disciplined approach to the process of architectural composition. The properties of reservoir formation (oil, gas, water, rock) are used as different components (or modules) which are interconnected together to enable evaluation of the reservoir formation. The advantage is that all the different components now "talk" to each other and share information. It is therefore recommended that this software be used to enhance calculations involving reservoir formation evaluation.
Software cost estimation is currently one of the most researched areas in software engineering. The reason is that software cost estimation provides the vital link
between the general concepts and techniques of
economic analysis and the particular world of software
engineering. Several models and techniques have evolved
within the last two decades. This research work reviews
some of the existing models. The COCOMO II - multiple
regression and Bayesian analysis models form the central
focus of this study and have been presented in detail.
"SCET" (Software Cost Estimation Tool) was developed
to simulate the COCOMO II model as the main program,
while Putnam (SLIM) and intermediate COCOMO were
added as sub modules. Test data were collected from
MULTISOFT SYSTEMS NIGERIA LIMITED, PORT HARCOURT. The data have been used to simulate
results of effort, schedule and cost of projects in the
mentioned organization. The results have been presented
and discussed. The result predictions from the COCOMO II model seem more accurate than those of the other models; this makes the model preferable to the others.
Iterative methods are interesting and valuable methods for solving linear
21. ESIEFARIENRHE, MICHAEL BUKOHWO (PGS/9899/404034)
SOFTWARE PROTOTYPING SYSTEM ANALYSIS AND DESIGN (A CASE STUDY OF NIGERIA POLICE CRIMINAL DETENTION SYSTEM, BAUCHI METROPOLIS), 2004
22. EKPENYONG, MOSES EFFIONG (PGS/9899/404037)
SIMULATING THE BEHAVIOUR OF PARALLEL MACHINES ON SEQUENTIAL ENVIRONMENT, 2004
systems of equations. In this project, three iterative methods were studied. Ten systems of linear equations were chosen for the implementation. The parameters used in the study were accuracy of results, number of iterations, running time and storage. The results show that successive over-relaxation performed better than the Jacobi and Gauss-Seidel methods.
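The successive over-relaxation iteration mentioned above reduces to Gauss-Seidel when the relaxation factor is 1. The sketch below uses a hypothetical diagonally dominant 3x3 system, not one of the thesis's ten test systems.

def sor(A, b, omega=1.25, tol=1e-10, max_iter=1000):
    """Successive over-relaxation for Ax = b; omega = 1 gives Gauss-Seidel."""
    n = len(b)
    x = [0.0] * n
    for iteration in range(1, max_iter + 1):
        max_change = 0.0
        for i in range(n):
            sigma = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new_xi = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i][i]
            max_change = max(max_change, abs(new_xi - x[i]))
            x[i] = new_xi
        if max_change < tol:
            return x, iteration
    return x, max_iter

A = [[4.0, 1.0, 1.0], [1.0, 5.0, 2.0], [1.0, 2.0, 6.0]]   # hypothetical system
b = [6.0, 8.0, 9.0]
print(sor(A, b))            # SOR
print(sor(A, b, omega=1))   # plain Gauss-Seidel for comparison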
The relationship between the System Analyst and the Client
(users) in software development is such that there are
significant differences in their level of understanding of
the problem and communication difficulty in expressing
software requirement of the problem. These differences
invariably lead to misconstrued system requirements, completely wrong software specifications, late software deliveries, loss of man-hours and millions of Naira. This
work advocates a client/Analyst involvement in system
requirement elicitation and analysis using software
prototyping techniques. This technique leads to the design of efficient software that meets the users' requirements and saves cost. In demonstrating this concept, the researcher developed a software prototype for a Criminal Detection System using the Nigeria Police Crime Department, Bauchi State, as a case study. These throw-away prototypes enable various concepts and designs inherent in software prototyping to be fully discussed and demonstrated. Using the prototype, it was observed that criminals can easily be detected if the system is implemented by the Nigeria Police Force. It was also observed that the use of prototyping in system analysis and design saves time and money and thus leads to more efficient software, where the client and developer are both involved in the development process.
The efficiency of parallel processing is dependent upon
the development of programming languages that
optimize the division of tasks among the processors. This
research simulates the behaviour of parallel machines on
23. HASSAN, ANAH BIJIK (PGS/20002002/404064)
LOAD DISTRIBUTION IN CLIENT-SERVER IMPLEMENTATION. CASE STUDY: MATERIALS MANAGEMENT IN STEYR NIGERIA, 2004
24. BLAMAH, NACHAMADA VACHAKU (PGS/20002002/404054)
THE RELIABILITY AND SECURITY OF ONLINE VOTING SYSTEMS: A CASE STUDY OF THE INDEPENDENT NATIONAL ELECTORAL COMMISSION (INEC), 2004
sequential environment using Java threads. Two parallel machine models were simulated: the Single Program Multiple Data (SPMD) model on arbitrary threads and the Multiple Program Multiple Data (MPMD) model on three threads, which are special cases of Single Instruction Multiple Data (SIMD) and Multiple Instruction Multiple Data (MIMD). The performance of these models was analysed and compared with a sequential model. It was observed that the runtime performance of the simulated SPMD indicated parallelism with increase in the number of elements, and that of the MPMD was faster than the sequential version by 80 ms. The MPMD results can further be improved by running programs with balanced load.
In this work, client-server model is studied, and
described as a system in which clients make requests
while servers provide service. Design models have been considered; the thin client has been praised in theory as the best. However, in practice, excess server load is identified as a major disadvantage in thin client-server implementation. This is because the processing logic and data are deployed on centralized servers. A fat single-executable (thick) client-server model is proposed to run on a distributed database system. Steyr Nigeria Materials Management is considered as a case study. A Unified Modeling Language (UML) approach is used in designing the system. The proposed approach is implemented using the Java programming language because of its object-oriented nature and because it provides remote procedure calls (RPC) in the form of remote method invocation (RMI).
Simulation results are obtained for some operations for
the thin and the thick client. They show that the proposed
approach performs better and thus gives a better response
time.
This research considered an online voting system as a critical system, where system dependability is of
25. OLUSANYA, O. MICHEAL (PGS/9899/404025)
COMPUTER APPLICATION OF MARKOV CHAIN MODEL ON TUBERCULOSIS OF THE LUNG, 2004
paramount importance. The most important dimensions
of dependability that were considered are reliability and
security. Two complementary approaches to systems
reliability were used: fault avoidance and fault tolerance.
Fault avoidance was ensured by the use of approaches to
systems development that minimize faults before systems
delivery. These include avoidance of dynamic memory
allocation, multiple generalization, pointers, interrupts,
recursion, and parallelism. Fault tolerance was achieved by implementing mechanisms for defensive (preventive) and retrospective fault detection, damage assessment, and forward/backward fault recovery. Two main architectures for fault tolerance implementation were considered. These are the N-version programming architecture (majority voting technique) and the recovery block architecture (redundant spare code). Systems security was implemented by making each cast vote a function of the voter identification number (VIN) and by encrypting both the VIN and the vote, thereby making it impossible to detect who the voter is and who is being voted for. The Unified Modeling Language (UML) was used for the design of the system, while Java's robust nature made it suitable for the implementation. Kiosk and remote online
voting systems with distributed servers were used to
implement both the reliability and security strategies. The
resulting software was tested using sample data drawn
from the case study and the results obtained were
satisfactory. A suitable software architecture for the
system with loosely coupled subsystems was finally
presented.
A computer application of the Markov Chain model has
been developed to project the probability of incidence of
Tuberculosis (TB) in Bauchi State. Projections were
made under the assumption that current TB control
measures will remain unchanged for the periods. The
program was tested using TB cohort report from Bayara
26. GOTENG, LONGINUS GOKOP (PGS/20002001/404052)
INTEGRITY CONSTRAINTS IN DISTRIBUTED DATABASE SYSTEMS: A CASE STUDY OF THE BOARD OF INTERNAL REVENUE, 2004
General Hospital, Bauchi, Bauchi State. The result shows the trend of TB incidence in 2001-2021 (for the population used) and it was discovered that, at the end of the prediction, more than 80% of the population used might be affected. Moreover, the equilibrium state was checked with the same result and it was discovered that in the next 65 years the probabilities of the Normal and Disease states will tend to zero independently and that of Death will tend to one (1) if nothing is done to control it. The program was written in C++ and can be used in any community to predict the probability of incidence of TB. It accepts available data from which predictions can be made.
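The kind of projection described above amounts to repeatedly applying a transition matrix to a state-probability vector. The three-state sketch below uses hypothetical transition probabilities, not the Bayara cohort data.

def project(p0, P, steps):
    """Propagate a state-probability vector p0 through 'steps' applications of
    the transition matrix P (rows sum to 1). Values below are hypothetical."""
    state = list(p0)
    for _ in range(steps):
        state = [sum(state[i] * P[i][j] for i in range(len(P)))
                 for j in range(len(P))]
    return state

# states: 0 = Normal, 1 = Diseased, 2 = Dead (absorbing)
P = [[0.95, 0.04, 0.01],
     [0.10, 0.80, 0.10],
     [0.00, 0.00, 1.00]]
print(project([0.9, 0.1, 0.0], P, 20))   # 20-step projection
print(project([0.9, 0.1, 0.0], P, 65))   # long run: probability mass drifts to the absorbing state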
Distributed systems offer more flexibility, scalability,
availability, reliability and speed than the old tightly
secured centralized systems. Serious problems inherent
with distributed systems such as security, update,
replication and consistency of databases at different
locations (sites) must be checked. This project explores
constraints that enforce the integrity (consistency) of
databases in distributed systems using semantic integrity
assertions. Four steps were used to accomplish this: 1)
accept input u, as an update; 2) send input to integrity
constraints checker, c; 3) if valid, continue the process
and send an output; 4) if invalid, keep it in the buffer for
constraint checker reference and proceed to the next
input. A built-in audit trail in the operating system was used to
track malicious updates from any site. The
implementation of the integrity constraints using The
Board of Internal Revenue as a case study was
accomplished using the Java programming language. This is because Java has features that handle distributed programming better than most programming languages. An object-oriented approach was adopted in the design using the Unified Modeling Language (UML). Results
obtained from all the databases when updates were made
at all the sites in the same domain were consistent
27. ESSIEN, NSEABASI PETER (PGS/20002001/404079)
IMPROVING SOFTWARE CONSTRUCTION PRODUCTIVITY THROUGH REUSABLE COMPONENT INTEGRATION (MAINTENANCE TOOL COMPONENT), 2004
28. OMEIZA, CECILIA OMENEKE (PGS/20002001/404072)
COMPUTER IMPLEMENTATION OF SOME GRADIENT TECHNIQUES IN NON-LINEAR OPTIMIZATION, 2004
(uniform). This shows that the integrity constraint
applied is working.
Software development involves the design of computer code to solve a specific task. It is observed that every software development effort involving Database Management (DBM) and Management Information System (MIS) software must use data maintenance, that is, file update code. From this it is observed that more than 60% of the code written in the development of software is actually duplicated. The key to solving this problem lies in software reuse. It is not only development time that is saved when employing software reuse; the quality, measured in terms of reliability, is equally favoured. In this project work, code for file creation, file maintenance (update) and transmission/connectivity was developed, christened the "MAINTENANCE TOOL COMPONENT". This component saves about 77.0% of the development time of Database Management software. That is, with this component, about 77.0% of the code meant for the development of standard database software in the areas of Payroll, Library Systems and Materials Management Information Systems is actually reused; not only that, the time spent on the Requirements Analysis, Design and Testing stages is also significantly reduced. This Maintenance Tool Component is developed using the Java programming language. Modeling and design are based on the Unified Software Development Process (USDP), which incorporates the Unified Modeling Language (UML).
Software in the Borland C++ programming language was developed to implement the three popularly used gradient techniques, namely the steepest descent method, the conjugate gradient method and the variable metric (quasi-Newton) method, for non-linear optimization. Eleven sample problems were selected to test-run the software. The
performances of these techniques were examined in
terms of accuracy, convergence rate, run-time, and
29. MUHAMMAD, LAWALI JABAKA (PGS/9899/404029)
NUMERICAL TREATMENT OF SOME WAVE EQUATIONS, 2004
30. ROKO, ABUBAKAR (PGS/9899/404042)
SOFTWARE PACKAGE FOR THE NUMERICAL SOLUTION OF INITIAL VALUE PROBLEMS IN ORDINARY DIFFERENTIAL EQUATIONS, 2004
storage requirement. It was discovered, in general, that the steepest descent method required the smallest storage size but was relatively slow to converge; the conjugate gradient method seems to be the most accurate. The variable metric method was relatively fast to converge but used the biggest storage size and had a high execution time. Useful recommendations that will assist those who may wish to implement the techniques were made.
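The simplest of the three techniques, steepest descent, can be sketched in a few lines; the fixed step size and the quadratic test function below are hypothetical and are not one of the thesis's eleven sample problems.

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Fixed-step steepest descent: move against the gradient until the
    gradient is numerically zero."""
    x = list(x0)
    for k in range(max_iter):
        g = grad(x)
        if max(abs(gi) for gi in g) < tol:
            return x, k
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x, max_iter

# hypothetical test function f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimised at (1, -3)
grad_f = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 3)]
print(steepest_descent(grad_f, [0.0, 0.0]))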
In this project, Finite Element schemes for solving wave equations were developed. These schemes were implemented using three wave equations, namely the generalised d'Alembert wave equation, the one-dimensional hyperbolic equation and the telegraph equation for an infinite cable. Algorithms were designed and implemented using the Borland C++ programming language. It was observed that the results obtained from the schemes compared favourably with existing analytic solutions, especially in the case of the generalised d'Alembert wave equation. In general, the accuracy increases with decreasing mesh size or an increase in the number of mesh points.
Users find it difficult to deal with implicit Runge-Kutta methods, due to the complexity of the iterations and numerical manipulations. In fact, the equations defining the stages are usually solved by iterative methods, for instance the Gauss-Elimination technique, which is very cumbersome. For this reason, this research project is aimed at circumventing this well-known difficulty through the development of a software package. After a systematic study of some existing implicit Runge-Kutta methods and the Gauss-Elimination/Newton-Raphson methods, we designed an algorithm which was, in turn, used to develop a software package for the simulation of general Runge-Kutta methods. The software so developed is capable of generating a large square matrix. Its reliability, accuracy and stability were tested using various initial value problems in ordinary differential equations and the
31. CHIROMA, IBRAHIM NYAM (PGS/404009)
GENETIC ALGORITHM APPROACH TO K-NODE SET RELIABILITY OPTIMIZATION FOR A DISTRIBUTED COMPUTING SYSTEM SUBJECT TO CAPACITY CONSTRAINT, 2005
32. ODION, OSHIOKHAIMHELE PHILIP (PGS/20002001/404049)
A MODEL FOR THE OPTIMIZATION OF MILITARY TRAINING (A CASE STUDY OF ARMY WING, NIGERIAN DEFENCE ACADEMY, KADUNA), 2005
results obtained are in fact in good agreement with the
exact solutions.
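The flavour of the implicit stage equations described above can be shown on the simplest case: a backward-Euler step solved by Newton's method for a scalar ODE. This is only an illustration of the idea, not the thesis's general Runge-Kutta package, and the stiff test problem is hypothetical.

import math

def backward_euler_step(f, dfdy, t, y, h, newton_iters=20, tol=1e-12):
    """One backward-Euler step: solve y_new = y + h*f(t+h, y_new) with
    Newton's method."""
    y_new = y + h * f(t, y)                    # explicit Euler predictor
    for _ in range(newton_iters):
        g = y_new - y - h * f(t + h, y_new)    # residual of the implicit equation
        dg = 1.0 - h * dfdy(t + h, y_new)      # derivative of the residual w.r.t. y_new
        delta = g / dg
        y_new -= delta
        if abs(delta) < tol:
            break
    return y_new

# hypothetical stiff test problem: y' = -50*(y - cos(t)), y(0) = 0
f = lambda t, y: -50.0 * (y - math.cos(t))
dfdy = lambda t, y: -50.0
t, y, h = 0.0, 0.0, 0.1
for _ in range(10):
    y = backward_euler_step(f, dfdy, t, y, h)
    t += h
print(f"y({t:.1f}) is approximately {y:.4f}")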
One of the important issues in the design of a distributed
computing system (DCS) is reliability. The reliability of
a DCS depends on the reliability of its communication
links and nodes, as well as on the distribution of its
resources, such as programs and data files. In the
reliability analysis of a DCS, the term K-Node Reliability (KNR) denotes the probability that all nodes in K, where K is a subset of the set of processing elements, are connected. A K-node set
reliability optimization with a capacity constraint is
normally a difficult problem and it is evident that it
belongs to the class of NP- hard problems. Therefore,
heuristic algorithms like genetic algorithms are potential
tools for obtaining optimal or near optimal solutions,
since they constitute a domain-independent problem-solving approach in which computer programs are evolved to solve or approximately solve the problems. This project research presents an efficient genetic algorithm that computes the reliability of a subset of network nodes of a DCS such that the reliability is maximized and the specified capacity constraint is satisfied. Compared with existing
algorithms (Chen et al, 1994) on various DCS topologies,
the proposed GA finds a sub-optimal design much more
efficiently.
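A generic genetic-algorithm skeleton (tournament selection, one-point crossover, bit-flip mutation) looks like the sketch below. The toy fitness function only mimics a reward-versus-capacity trade-off; it is not the thesis's K-node reliability evaluation, and all parameters are hypothetical.

import random

def genetic_algorithm(fitness, n_bits, pop_size=30, generations=100,
                      cx_rate=0.8, mut_rate=0.02):
    """Generic bit-string GA skeleton; the fitness function is supplied by the caller."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1 = max(random.sample(pop, 3), key=fitness)    # tournament selection
            p2 = max(random.sample(pop, 3), key=fitness)
            c1, c2 = p1[:], p2[:]
            if random.random() < cx_rate:                   # one-point crossover
                point = random.randint(1, n_bits - 1)
                c1, c2 = p1[:point] + p2[point:], p2[:point] + p1[point:]
            for child in (c1, c2):                          # bit-flip mutation
                for i in range(n_bits):
                    if random.random() < mut_rate:
                        child[i] ^= 1
                new_pop.append(child)
        pop = new_pop[:pop_size]
        best = max(pop + [best], key=fitness)
    return best, fitness(best)

# toy objective: reward selected nodes but penalise exceeding a capacity of 6
toy_fitness = lambda bits: sum(bits) - 5 * max(0, sum(bits) - 6)
print(genetic_algorithm(toy_fitness, n_bits=12))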
One persistent problem in military training is how to
make decisions concerning the proper mix of training
aids, conventional training resources, and resource
utilization to maintain training proficiency. Given the
proficiency standards, the model determines the
resources needed and the frequency with which each
method needs to be repeated, in order to maintain the
standards of proficiency at minimum cost. Data collected
from the training of army cadets was modeled using
linear programming. The two-phase method was applied
33. FAMUTIMI, RANTIOLA FIDELIS (PGS/20002001/404058)
A COMPARATIVE STUDY OF SEQUENTIAL AND PARALLEL PROGRAMMING IN SOME MATRIX OPERATIONS, 2006
34. SANI, USMAN (PGS/9899/404026)
A SOFTWARE TOOL FOR RADIATION EXPOSURE MONITORING: A CASE STUDY OF NATIONAL HOSPITAL FOR WOMEN AND CHILDREN, ABUJA, 2006
using the Pascal programming language. The result obtained shows that the training cost was reduced by 6.2%.
Most of the existing statistical packages used in scientific
computations employ sequential processing in their
implementation. That is, in solving a particular problem,
the sequential program performs some sequential
processing in a predetermined order until the whole data
have been processed. The sequential program used is
regarded as a process. Concurrent processing is a method
of processing, whereby the entire processing is further
grouped into smaller subgroups and these subgroups are
then processed in parallel. Concurrent processing
provides a way to organise processing that contains
relatively independent parts so as to process them at the
same time thereby reducing immensely the overall
processing time of the job. It also provides a way to make
use of multiple processors. In this way, each independent
part (or process) is assigned a processor and all parts are
made to communicate so that the individual part's result
can be assembled together to obtain the required overall
result. This thesis makes a comparative study of sequential and parallel programming in some matrix-related operations, using the response time (speed-up) of processing and the memory requirements as parameters.
The threat ionizing radiation poses to workers warrants
the operation of a personnel monitoring system. In this
project report, a software tool was developed for a comprehensive film badge service. This tool simplifies the record-keeping requirement for maintaining information on individuals who may be exposed to radiation in the course of their working activities. It also provides complete traceability of every potential exposure to radiation that an individual may incur. The system, written in Visual Basic, also records personal information (name, age, sex, address) and exposure history,
35. ZIRRA, PETER BUBA (PGS/20002001/404057)
SOFTWARE DEVELOPMENT EFFORT ESTIMATION SYSTEM: A CASE STUDY OF MICROFOCUS NIGERIA LIMITED, KADUNA, 2006
36. IDRISSA, DJIBO (PGS/404014)
SIMULATION AND DETECTION OF 'SOFT ERRORS' THROUGH SOFTWARE FAULT INJECTION AND SOFTWARE FAULT TOLERANCE TECHNIQUES, 2006
and produces as output exposure reports for personnel using the film badge.
Software effort estimation has posed serious challenges
to software developers. Therefore, software practitioners
need to understand the effort to be expended and the
schedule to be followed in order to make a reasonable estimate of resources, and to strive to avoid underestimation and overestimation of resources or of the amount of time required to deliver the product. Hence, this study
identified models of quantitative measures within
software product such as the Constructive Cost Model II
(COCOMO II) - Post Architecture, Function Points
(FPA) and Putnam-Software Life-cycle Models (SLIM)
for effort estimation. Primary and secondary sources of
data were used to obtain reliable information from Microfocus Nigeria Limited, Kaduna. The Visual Basic programming language was used for the implementation of the three models. Results obtained were analyzed. It was found that Microfocus Nigeria Limited uses expert judgment and analogy methods of effort estimation, resulting in poor prediction of staff, time and size. Based on the results of the implementation, some recommendations were proffered to Microfocus Nigeria Limited and other software developers, among which are that Microfocus Nigeria Limited should adopt COCOMO II for effort, time and staff estimates and FPA for effective sizing of software projects, and should take into consideration factors affecting effort estimation; also, software practitioners should adopt a balanced model for
realistic estimate of resources.
A systematic approach for introducing data and code redundancy into an existing program written in C++ has been fully described. The transformations aim at making the program able to detect most of the soft errors (bit-flip errors) arising in microprocessor-based digital architectures as the consequence of the interaction with
37. AJIE, IKECHUKWU JAMES (PGS/404013)
SIMULATION OF SOME MULTIPLE CRITERIA DECISION MAKING PROBLEMS USING FUZZY SET APPROACH, 2006
38. YUSUF, MOHAMMED AGETEGBA (PGS/20012002/404098)
DATA SECURITY: LAYERED APPROACH ALGORITHM, 2007
radiation. The fault coverage obtained by the method, as well as the slow-down and code size increase it causes, were investigated. For the proposed transformation used on all the selected benchmarks (bubble sort, matrix multiplication, and factorial routines), it was found that the average increase in the source code size (in terms of number of lines) was 1.82, the average increase in the executable code size (in terms of number of bytes) was 2.3, and the performance slow-down introduced by the application of the rules was about 2.66. Additional work should be carried out towards the definition of a new set of rules, allowing the reduction of the resulting overhead (in terms of memory and speed) at the cost of slightly reduced fault coverage capacity. More research should also be carried out to complete the set of rules to cover all the constructs of a high-level language.
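The basic data-redundancy idea behind such transformations is to keep duplicated copies of intermediate results and flag any mismatch as a detected soft error. The sketch below only illustrates that idea in Python; it is not the thesis's automatic C++ source-to-source tool.

def checked_sum(values):
    """Duplicate-and-compare: maintain two copies of the running result and
    raise an error if they ever disagree (a detected soft error)."""
    total_a, total_b = 0, 0
    for v in values:
        total_a += v
        total_b += v
        if total_a != total_b:
            raise RuntimeError("soft error detected: duplicated results differ")
    return total_a

data = list(range(100))
total_a = checked_sum(data)
print(total_a)                       # normal run: 4950
flipped = total_a ^ (1 << 3)         # simulate a bit flip in one of the two copies
print("mismatch detected:", flipped != total_a)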
Human beings have always been faced with the problem
of data processing and decision making based on the
data. This problem increases when it involves choosing
between alternatives based on multiple criteria. This
work used fuzzy set approach to study decision making
under multiple criteria. The level of commitment of members of a social club and also the grading system of students were studied. The resultant models were implemented using the C++ programming language. The concept of fuzzy similarity degree was used to determine the total scores of a set of students. The result of the simulation showed that it is very efficient in breaking ties experienced when the normal method based on
percentage is used. The notion of membership function
was also used to arrive at a model that was used to
determine the level of commitment of members of a
social club. Two different weighting methods were used
to implement the model. The results of the
implementations showed that the decisions taken were
consistent with one another. It also showed that the level
of commitment of members could be determined
quantitatively.
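The membership-function idea used for the commitment model can be illustrated with a small weighted fuzzy score; the membership shapes and weights below are hypothetical and are not the two weighting methods used in the thesis.

def triangular(x, a, b, c):
    """Triangular membership function rising from a to b and falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def commitment(attendance_pct, dues_pct, weights=(0.6, 0.4)):
    """Weighted fuzzy score for a club member's level of commitment;
    all shapes and weights are illustrative assumptions."""
    mu_attend = triangular(attendance_pct, 20, 80, 100)
    mu_dues = triangular(dues_pct, 10, 90, 100)
    return weights[0] * mu_attend + weights[1] * mu_dues

for member, (att, dues) in {"A": (85, 95), "B": (50, 40), "C": (15, 100)}.items():
    print(member, round(commitment(att, dues), 2))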
In this study we adopted the Two-Factor Authentication strategy. Two-Factor Authentication is a security process that confirms user identities using two distinctive factors, that is, something they have (this
39. ADEGOKE, GBOLAGADE KOLA (PGS/20012002/4040119)
COMPUTER SIMULATION OF PRODUCTION AND OPERATIONS MANAGEMENT (A CASE STUDY OF DEBO INDUSTRIES LIMITED, LAGOS), 2008
40. ABBA, SANI (PGS/20052006/4040371)
SIMULATION OF PERSISTENT NANO-BASED SECURITY OPERATING SYSTEM KERNEL IN WINDOWS XP ENVIRONMENT, 2008
factor includes keys, cards, tokens and so on) and something they know (such as passwords and PINs), except that in this case we do not use physical objects like cards, tokens and physical keys; only encryption keys and a password are applied. Requiring individuals to supply two different credentials reduces the risk of fraud and makes our data or information more secure. The Two-Factor Authentication method employed by this study combined a Trigger-Challenge-Response password algorithm and the Data Encryption Standard algorithm to provide double protection. The software design has thus provided two layers of protection by authenticating individuals twice, thereby offering much resistance to single-factor attacks, because the user has to know both the key and the password. Further research work is recommended in the implementation of the encryption program with other encryption algorithms, such as Triple DES (TDES), that will provide a significantly higher level of security.
This project report concerns simulation of production and
operations management as an essential activity in a
manufacturing sector. Production and operations management is currently one of the most researched areas in the manufacturing sector of the economy. The reason
is that production and operations management facilitates
the effective and efficient accomplishment of production
objectives of an organization. Some mathematical and
statistical models were used to compute the expected
process duration, actual process duration, economic order
quantity whose purpose is to reduce costs, and machine
breakdown frequency whose purpose is to predict
machine breakdown and for planning and preventive
maintenance. Test data were obtained from Debo Industries Limited, Lagos, and were simulated. The results generated were used to predict a real-life situation and
have been presented and discussed. The language of
implementation of the model is Turbo Pascal due to its
capability as a scientific programming language.
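The economic order quantity mentioned above follows the classic formula Q* = sqrt(2DS/H); the demand and cost figures in this sketch are hypothetical, not Debo Industries' data.

import math

def eoq(annual_demand, order_cost, holding_cost):
    """Classic economic order quantity and the resulting ordering pattern."""
    q = math.sqrt(2 * annual_demand * order_cost / holding_cost)
    orders_per_year = annual_demand / q
    total_cost = orders_per_year * order_cost + (q / 2) * holding_cost
    return q, orders_per_year, total_cost

q, n_orders, cost = eoq(annual_demand=12_000, order_cost=150.0, holding_cost=2.5)
print(f"EOQ = {q:.0f} units, {n_orders:.1f} orders/year, annual ordering+holding cost = {cost:.0f}")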
This research work is an attempt to design a simulated
prototype to model the actual kernel operation of
41. ALI, ISA IBRAHIM (PGS/20052006/4040375)
ISLAMIC INHERITANCE SYSTEM USING OBJECT ORIENTED DEVELOPMENT, 2008
the Windows XP operating system and to evaluate the kernel performance based on micro-benchmark and kernel
profiling performance metrics. A new prototype, namely the Persistent Nano-Based Security Operating System Kernel (PNSOSK), was designed and implemented using Microsoft Visual C++, the Application Programming Interface (API), and the Unified Modeling Language (UML). The simulated kernel competes with the Windows XP kernel on functions such as handling hardware exceptions and interrupts, scheduling, prioritizing, dispatching threads, synchronizing execution across processors in a multiprocessor environment, kernel security monitoring, and system initialization and startup (Windows Registry). The prototype ran on an Intel IA-64 hardware platform (Intel Pentium IV MMX Dual-core 2.8 GHz processor) computer running the Microsoft Windows XP Professional Version 2007 Service Pack 3 operating system. Numerical experiments on random instances of the performance metrics suggest that the PNSOSK model performs satisfactorily in the Windows XP environment. The mutual exclusion primitives, synchronization and concurrency problems were solved using the model. Further work is to be carried out on a distributed Nano-Kernel for fault containment in a distributed environment.
An Islamic inheritance system using object-oriented development was the focus, with the view of producing a computerized inheritance system. Inheritance is a very complicated issue in Islam, and it is as well a serious trial and affliction for Muslims. Sharing justly is not just a necessity but also an obligation if and only if the judge wants to succeed in this world and even in the hereafter. Quite a number of verses and a few prophetic traditions encourage Muslims to hold on to this knowledge and remind them of the risk of losing it. Also investigated were factors concerning the risk of loss when Muslims deal with inheritance in accordance with their own selfish interests and not in accordance with the
42. ALIYU, SULEIMAN ONIMISI (PGS/200506/4040370)
DESIGNING AN INTERACTIVE ENVIRONMENT FOR LEARNING FRACTIONS IN PRIMARY MATHEMATICS USING OBJECT ORIENTED SOFTWARE ENGINEERING, 2009
rulings set out by their Creator. Procedures for allocating and allotting the right fraction/ratio so that each and every heir/inheritor receives the legal proportion, based on the verdict of our pious predecessors, were developed according to the Islamic injunction. The Unified Modeling Language was selected for the development, flow charts for the simple design, and the Java programming language for the implementation. A database has been created to assist in solving the problem manually and it has been simulated for the implementation. The output obtained achieved the desired goals, as shown in the results.
With the tremendous growth of the use of computers in
schools, sound research is needed on how to design
interactive learning environments that effectively help
children by promoting reflective cognition and better
learning. The research described in this thesis addresses
the following issues in designing interactive mathematics
learning environments for children: a) The user interrace
design should support children's learning of mathematical
concepts. b) Design features should be effective in
promoting reflective cognition and better learning. c) The
learning environment should be tailored to meet
children's affective needs. Psmath-fraction is an
interactive learning activity aimed at assisting elementary
school children in understanding fractions was
developed. Techniques such as visual feedback and
scaffolding were used in the design to promote reflective
cognition. The result of this research provides a
motivating learning environment and the features of the
software matched children's interests and are conducive
to children's enjoyment of the learning activity.
The current research suggests some directions for future research:
1. Research is needed to develop guidelines and principles on how to design effective navigational structures and sequencing of activities for different domains.
2. Research is needed to develop guidelines and principles on how to design different styles of interfaces suitable for different types of learners.
3. Research is needed to investigate how to precisely assess users' understanding of the domain in order to provide more targeted and adaptive feedback.
4. Research is needed to investigate what types of entertainment elements promote children's motivation without distracting them from the exploration of the mathematical concepts.
43. YAHAYA, MOHAMMED ONIMISI (PGS/0506/4040366)
DEVELOPMENT OF SEQUENCE AND SERIES ANALYSER SOFTWARE: AN INTEGRATED SOFTWARE FOR MATHEMATICAL SEQUENCES AND SERIES ANALYSIS, 2009
44. UMAR, FATIMA ZAMBUK (PGS/20052006/4040374)
COMPARATIVE STUDIES OF SOFTWARE ARCHITECTURES, 2009
45. YA'U, ABDULSALAM GITAL (PGS/20052006/4040362)
DESIGN AND IMPLEMENTATION OF INTERNET-BASED INFORMATION SYSTEM FOR MOBILE DEVICES (CASE STUDY OF MATHEMATICAL SCIENCES PROGRAMME, ATBU BAUCHI), 2009
In this work, language design principles in computer
science and notions related to sequences in mathematics
were studied. The evolutionary prototyping design method was used to develop software called the Sequence and Series Analyser. It extended the features of Sequencer and Series Plotter by including features such as
sequence generation, sequence analysis, special
sequences like Fibonacci sequence, sequences of partial
sums and series, sequence prediction and mathematical
induction.
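To give a flavour of the kind of analysis such a tool automates (an
illustrative sketch only, not code from the Sequence and Series
Analyser), the Java fragment below generates the first terms of the
Fibonacci sequence and the sequence of partial sums of a geometric
series:

import java.util.ArrayList;
import java.util.List;

/** Illustrative sketch: generating a sequence and its partial sums.
 *  Not taken from the Sequence and Series Analyser itself. */
public class SequenceSketch {

    /** First n Fibonacci numbers: 1, 1, 2, 3, 5, ... */
    static List<Long> fibonacci(int n) {
        List<Long> terms = new ArrayList<>();
        long a = 1, b = 1;
        for (int i = 0; i < n; i++) {
            terms.add(a);
            long next = a + b;
            a = b;
            b = next;
        }
        return terms;
    }

    /** Sequence of partial sums s_k = t_1 + ... + t_k of a given sequence. */
    static List<Double> partialSums(List<Double> terms) {
        List<Double> sums = new ArrayList<>();
        double running = 0.0;
        for (double t : terms) {
            running += t;
            sums.add(running);
        }
        return sums;
    }

    public static void main(String[] args) {
        System.out.println("Fibonacci: " + fibonacci(10));
        // Geometric sequence 1, 1/2, 1/4, ...; its partial sums approach 2.
        List<Double> geometric = new ArrayList<>();
        for (int k = 0; k < 10; k++) geometric.add(Math.pow(0.5, k));
        System.out.println("Partial sums: " + partialSums(geometric));
    }
}

For the geometric sequence 1, 1/2, 1/4, ..., the printed partial sums
visibly approach 2, the sum of the corresponding infinite series.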
The field of Information Systems is experiencing major
changes as the underlying technology moves from
centralized, host-based computing to distributed,
client-server computing. Computer transactions using the
client-server model are very common, and the model
provides a convenient way to interconnect and gather
information that is distributed across different areas. In
this thesis, a study was made of a number of common
software architecture styles upon which many systems
are currently based. A web-based application designed
using a software development methodology was used to
measure the effectiveness, efficiency, maintainability,
reusability and propagation delay-time performance of
the Repository and two-tier Client-Server architectures.
The client-server architecture was found to be more
efficient and effective, with a higher rate of component
reusability and higher maintainability than the repository
architecture.
The Mathematical Sciences Programme of Abubakar
Tafawa Balewa University, Bauchi, has been using
46.
SALAMI,
HAMZA
ONORUOIZA
PGS/0506/4040363
AN INTEGRATED
ASSEMBLY
LANGUAGE
SIMULATOR
(PENSIM)
2009
47.
SAMBO,
AMINA SANI
PGS/20012002/4040110
COMPARATIVE
ANALYSIS OF
INTERNET
BROWSERS
2010
papers and ink since the inception of the University to
pass information to both staff and students of the
department. In this project, an Internet-based application
called Math_info for handheld devices was designed to
support the activities of the Mathematical Sciences
Programme. Basic activities such as viewing student
results, the lecture timetable and the courses registered by
individual students in the programme were considered for
implementation in this application. The application also
contains the profiles of all academic staff in the
programme. Staff can also access course allocations for a
particular semester.
A study of existing microprocessor simulators was made,
and their limitations were identified. These simulators do
not support 32-bit registers and instructions. The object
oriented analysis and design methodology was then used
to develop PENSIM (the Pentium Simulator). The
simulator was tested and found to be very reliable. It
extended some of the features of existing microprocessor
simulators such as number of registers and instructions
implemented. PENSIM also provided a more detailed
tutorial compared to those found in the existing
simulators.
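As a loose, hypothetical illustration of what supporting 32-bit
registers and instructions involves (this is not PENSIM's code; the
register names and methods are assumptions), the Java fragment below
models a tiny register file and executes a simulated MOV and ADD:

import java.util.HashMap;
import java.util.Map;

/** Hypothetical sketch of a tiny 32-bit register simulator; not PENSIM code. */
public class RegisterSketch {
    // 32-bit general-purpose registers, stored as Java ints (32 bits wide).
    private final Map<String, Integer> registers = new HashMap<>(Map.of(
            "EAX", 0, "EBX", 0, "ECX", 0, "EDX", 0));

    void mov(String reg, int value) {             // MOV reg, imm32
        registers.put(reg, value);
    }

    void add(String dest, String src) {           // ADD dest, src (wraps at 32 bits)
        registers.put(dest, registers.get(dest) + registers.get(src));
    }

    public static void main(String[] args) {
        RegisterSketch cpu = new RegisterSketch();
        cpu.mov("EAX", 5);
        cpu.mov("EBX", 7);
        cpu.add("EAX", "EBX");
        System.out.println("EAX = " + cpu.registers.get("EAX"));  // prints 12
    }
}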
As more and more services become available on the
Internet, the issue of fast and secure access to online
resources gains more importance. The growth of the
Internet has encouraged a high number of people to
explore and take advantage of the World Wide Web
(WWW). The window to the World Wide Web is a web
browser, hence the development of the various web
browsers on the market today. A comparative study of
four web browsers, namely Netscape Navigator, Mozilla
Firefox, Internet Explorer and Opera, was carried out. A
web portal which was fully developed and tested was
used to evaluate the performance of the four browsers.
The results revealed that Mozilla Firefox and Netscape
Based on the research, it is
recommended that users have
an idea of the kind of site
they want to visit and, based
on that, use a specific
browser to optimize their
browsing.
48.
CHAKA,
JOHN GYANG
PGS/20012002/4040126
COMPARATIVE
STUDIES OF
ANTIVIRUS
SOFTWARES IN
NETWORK
SECURITY
2011
49.
TELLA,
MUHAMMAD
PGS/20012002/404091
SIMULATION OF A
GROUP DECISION
MAKING (GDM)
MODEL FOR
MANAGEMENT
DECISION
SUPPORT
2012
Navigator perform best in download time and page layout,
Internet Explorer is best in conserving memory usage,
privacy and security, and Opera performs best in speed
and performance.
This thesis compares the performance of four of the most
widely used antivirus software packages (viz. AVG
Antivirus 9.0, Symantec Norton Antivirus 2010,
Kaspersky Internet Security 2010, and Avast Antivirus
5.0) in order to ascertain the most efficient and effective
in handling network threats. A comparison of the
performances of these antivirus software packages is
needed to enable users to make informed choices in the
use of any of them. AVG Antivirus 9.0 was found to be
more effective and efficient, with higher performance
than Symantec Norton Antivirus 2010, Kaspersky
Internet Security 2010, and Avast Antivirus 5.0.
Group Decision Making (GDM) is one of the approaches
that have gained wide acceptance in organisations. In
GDM problems, a set of experts is faced with the choice
of the best alternative from a set of many alternatives. A
GDM model with incomplete fuzzy preference relations
was developed to overcome the problem of inconsistency
in incomplete fuzzy preferences. In this study, six laptop
computers with different specifications were used for the
simulation. It is expected that, given any set of
alternatives, the software should be able to monitor the
consistency or inconsistency of an expert in giving his
preference values for the pairwise comparisons of the
alternatives, and be able to suggest alternative preference
values for any given pair for which the expert cannot
provide a preference value. The research also investigated
the reliability of the model when the number of
alternatives becomes too large to cope with. The work is
designed along two aspects, interface requirements and
logical goals, with Java used as the programming
language in the coding of the models.
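The abstract above does not reproduce the estimation formula, but a
widely used rule for completing a missing entry in an incomplete fuzzy
preference relation is additive consistency, p_ik = p_ij + p_jk - 0.5.
The Java sketch below, an illustration under that assumption rather
than the thesis's own code, suggests a value for one missing pairwise
comparison from two known ones:

/** Illustrative sketch (not the thesis's code): estimating a missing value
 *  in an incomplete fuzzy preference relation using additive consistency,
 *  p_ik = p_ij + p_jk - 0.5, clamped to [0, 1]. */
public class PreferenceSketch {

    static double estimate(double pij, double pjk) {
        double pik = pij + pjk - 0.5;
        return Math.max(0.0, Math.min(1.0, pik));   // keep the value in [0, 1]
    }

    public static void main(String[] args) {
        // Expert prefers alternative 1 over 2 with degree 0.7,
        // and alternative 2 over 3 with degree 0.6.
        double p12 = 0.7, p23 = 0.6;
        // Suggested value for the missing comparison of 1 against 3.
        System.out.println("Estimated p13 = " + estimate(p12, p23));  // 0.8
    }
}

In the example, an expert who prefers alternative 1 over 2 with degree
0.7 and alternative 2 over 3 with degree 0.6 would be offered 0.8 as a
consistent value for the missing comparison of 1 against 3.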
Further research work can be
carried out to simulate several
scenarios in which many
decision makers are
involved.
50.
BELLO,
MUHAMMAD
ALIYU
PGS/20002001/404069
SOFTWARE TOOL
FOR
IMPLEMENTATION
OF GAUSS-LOBATTO
RUNGE-KUTTA
METHODS
2012
51.
MAISHANU,
MARYAM
PGS/20012002/404089
AN INTERFACE
FOR A PC BASED
OSCILLOSCOPE
2011
The increasing gap between the speeds of processors and
main memory of a computer has led to hardware
architectures with an increasing number of caches to
reduce average memory access times. Such deep memory
hierarchies make the sequential and parallel efficiency of
computer programs strongly dependent on their memory
access pattern. In this thesis, we consider implicit
Gauss-Lobatto Runge-Kutta collocation methods for the
solution of ordinary differential equations and study their
efficient implementation on different parallel platforms.
In particular we focus on systems of ordinary differential
equations which are characterized by large Lipschitz
constants. This class of equations is usually very stable
but often difficult to solve even numerically because of
their fast responding component which tends to control
the stability of numerical methods. Here we explore how
the potential parallelism in the stage vector computation
of such equations can be exploited in a pipelining
approach, leading to a better locality behavior and higher
scalability. Experiments show that this approach results
in efficiency improvements on several recent sequential
and parallel computers.
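For readers unfamiliar with the stage vectors referred to above, an
s-stage implicit Runge-Kutta collocation method for y' = f(t, y)
advances the solution by solving the coupled stage equations (the
general textbook form; the particular Gauss-Lobatto coefficients
a_{ij}, b_j, c_j used in the thesis are not reproduced here):

Y_i = y_n + h \sum_{j=1}^{s} a_{ij} \, f(t_n + c_j h, \, Y_j), \qquad i = 1, \dots, s,

y_{n+1} = y_n + h \sum_{j=1}^{s} b_j \, f(t_n + c_j h, \, Y_j).

It is the solution of these coupled stage vectors Y_1, ..., Y_s at
each step that the thesis parallelises in a pipelined fashion.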
A PC-based oscilloscope was developed with an appropriate
interface to an electronic device, specifically the PC
sound card with its integrated ADC using MATLAB data
acquisition tool. The interface developed is easy to
understand and flexible enough to handle data acquisition
tasks with various configurations. The system was tested
and the results of sampling with different trigger settings
were observed by adjusting duration, timeout, trigger
repeat and threshold values for the advanced trigger. Data
exploration tools were used to explore data in various
ways after the signal has been captured. Other features of
the system tested include data export to an Excel file,
generation of a subset of the data using the subplot panel,
and how an exported image could be displayed in a graphics
Further research in this area
should consider integrating
the MATLAB Signal
Processing Toolbox to
improve signal analysis and
measurement capabilities for
the software.
52.
UMOH,
NNAMSO
MICHAEL
PGS/0607/4040396
SOFTWARE
RELIABILITY
LEVELS CASE
STUDIES OF SOME
SOFTWARE
COMPANIES.
2013
53.
MOHAMMED,
DANLAMI
PGS/20062007/4040397
QUERY
OPTIMISATION
AND
PERFORMANCE
ANALYSIS ON WEB
2013
application. When put into use, the software could
contribute to learning and self-development as users gain
insight into how MATLAB and the DAQ Toolbox are utilized
in developing applications that meet the requirements of
more specific data acquisition tasks.
Software reliability is a major factor in software quality
and is the index by which users judge a particular piece of
software. Software reliability, to the users, is the ability of
the software to perform its perceived function without
failure. It therefore must be determined in all software
projects of importance. Some software reliability models
were reviewed and the Jelinski-Moranda model was
chosen, as it was considered the most appropriate for
the intended software tool (i.e. RelSoft). Some software
houses were visited and failure data were collected,
formatted and analysed. The least squares method was
used to determine some parameters such as Et (i.e. total
errors) and K, the proportionality constant. These
parameters were used as input to design the RelSoft tool,
which was then used in determining the reliability levels
of the selected software produced by Nigerian software
companies. The results obtained from RelSoft showed
that nine out of the ten selected software packages have
reliability levels ranging from 0.507 to 0.825, except one
which has a reliability level of 0.425. Other reliability
indicators such as failure rate and MTTF (i.e. Mean Time
To Failure) were also calculated using RelSoft. Software
reliability is measured on a scale ranging from zero to one
(i.e. 0 to 1). However, for life-critical applications the
reliability must be in the region of 0.999 to 1.0 for the
software to be considered reliable.
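The abstract names the two parameters Et (total errors) and K (the
proportionality constant) but not the equations themselves. A standard
textbook formulation consistent with those parameters, given here only
as an illustration since the thesis's exact equations are not quoted,
writes the hazard (failure) rate after E_c(\tau) errors have been
corrected by debugging time \tau as

z(\tau) = K \, [E_T - E_c(\tau)],

and, assuming a constant hazard between failures,

R(t) = e^{-z t}, \qquad \mathrm{MTTF} = \frac{1}{z},

which corresponds to the failure-rate and MTTF indicators computed by
RelSoft.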
Query Optimization is the process of choosing the most
efficient way to execute a Structured Query Language
(SQL) statement, and a web service is a software system
designed to support interoperable machine-to-machine
interaction over a network. These two technologies, when
Further research in this field
should also consider trying
other programming languages
for the implementation, such
as PHP and MySQL, or the
Java Server
integrated, offer better performance of applications by
eliminating machine-to-machine dependence. Using
sample sizes of 1000, 2000 and 3000 records, the thesis
found that adding more memory to the system improves
the performance of the Relational Database Management
System (RDBMS). In another case, changing the hard
disk size has little effect on the performance of the
RDBMS. Also, adding more data to the database has an
effect on the performance of the RDBMS, while changing
the processor from single core to dual core and quad core
improves the performance of the RDBMS. In all the cases
stated above, SQL Server 2005 offered the best
performance, followed by Microsoft Access 2003, Oracle
10g and MySQL 5.1 respectively. The application is
recommended for RDBMS developers.
programming language.
It is recommended that
future research in this area
should consider increasing
the sample size. This is to
enable the researcher to see
the effect of the hardware
changes clearly.