Design and Analysis of an Electronic Platform for Examinations*

Open Access Library Journal, 2018, Volume 5, e4919
ISSN Online: 2333-9721; ISSN Print: 2333-9705
DOI: 10.4236/oalib.1104919
Daniel Paul Godwin1, Modi Bala2, Abdulfatai Habib1
1Department of Computer Science, Faculty of Science, Kebbi State University, Birnin Kebbi, Nigeria
2Department of Computer Science, Faculty of Science, Gombe State University, Gombe, Nigeria
Abstract
An alternative approach to conducting multiple-choice tests is the e-Examination Platform, or simply the Computer-Based Test (CBT). The CBT is necessary given the large population of students enrolled in Nigerian secondary and higher institutions of learning; at the same time, student numbers and the unique nature of some departmental courses hinder its total implementation. This paper investigates the challenges attributed to the manual processes of conducting such tests and examinations. It also examines the potential for using student feedback in the validation of assessments, attempting to minimize the difficulties associated with it. The process was further modified to run in parallel, making it faster, helping to minimize human errors, improving the accuracy of the CBT, and producing quality and transparency in the process. A survey of 230 students from various schools was taken to sample their opinions through a carefully worded questionnaire. The data collected were then collated and analyzed. The analysis showed that more than 95% of the students surveyed were already competent in the use of computers and the CBT platform. Also, more than 90% of them found the platform easy to navigate and use. Lastly, about 98% of them said that the platform was a much better alternative to the manual process of conducting the same tests and examinations.
Subject Areas
Software Engineering, Simulation of Software Functionality
Keywords
CBA, CBT, Algorithm, SQL, Dreamweaver
*CBTA Electronic Examinations Platform.
How to cite this paper: Godwin, D.P., Bala, M. and Habib, A. (2018) Design and Analysis of an Electronic Platform for Examinations. Open Access Library Journal, 5: e4919. https://doi.org/10.4236/oalib.1104919

Received: September 18, 2018
Accepted: November 20, 2018
Published: November 23, 2018
Copyright © 2018 by authors and Open Access Library Inc.
This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).
http://creativecommons.org/licenses/by/4.0/
Open Access
1. Introduction
An e-Examination Platform, or Computer-Based Assessment/Test (CBA or CBT), is an e-assessment: a computer-administered method for conducting examinations in which responses to questions are electronically recorded, assessed, or both. In this particular case, a standard pre-test-intervention-post-test design [1] was not utilized; rather, the CBT here implies the use of devices such as computers, cell phones, iPads, etc. [2]. In recent times, CBTs have been introduced as new assessment platforms in some tertiary educational institutions within Nigeria. The medium is considered faster and more efficient than the traditional paper-and-pencil approach [3] [4]. Some tertiary educational institutions in Nigeria, such as Gombe State University, Gombe, and a few selected ICT-compliant secondary schools in Gombe, have tested the software to that effect. This was achieved by sampling about 230 students at random for a trial run with the home-grown software discussed in this paper. Other institutions, such as Kebbi State University, Kano State University, the University of Lagos, the National Open University of Nigeria, polytechnics, and colleges of education, have also introduced CBT for their yearly entrance examinations, beginning with the Post University Matriculation Examination (Post-UTME).
CBTs are also used for semester examinations, especially where classes are too large. The system enables educators and trainers to author, schedule, deliver, and report surveys, quizzes, tests, and other types of examinations. CBTs can be built as stand-alone systems or as part of virtual learning environments accessible through the World Wide Web. Most CBT platforms provide a collection of tools that enable automatic marking of responses to multiple-choice questions.
A good example of a CBT platform is the Business Language Testing Service (BULATS), developed and managed by the English Department of Cambridge University, United Kingdom. The system is a highly sophisticated online test platform that determines a candidate's ability quickly and accurately using adaptive testing techniques. As a candidate progresses through the test, the computer selects subsequent questions on the basis of previous answers, so the test becomes progressively easier or more difficult until a consistent level of ability is reached. Many candidates find the individual, distraction-free environment, and often the immediate score report, attractive features of the CBT system. This paper discusses a CBT platform that incorporates unique tools improving upon existing platforms.
The remainder of this paper is organized as follows: a brief summary of related works is given in Section 2; Section 3 contains the background of the work; Section 4 defines how the proposed system is designed; Section 5 explains the method used for data collection; Section 6 discusses the results; and Section 7 concludes the discussion.
2. Related Works
Researchers have recently performed a large-scale review examining performance differences between CBT and paper-based tests. The review showed that, when a CBT is similar in format to a pencil-and-paper test, it has little or no effect on test performance [5]. From the students' perspective [4], reactions to CBT have been mixed: previous research has shown that many people anticipated problems with it.

This paper argues that the inexorable advance of technology will force fundamental changes in the format and content of assessment. Education leaders in several states and numerous school districts are already acting on that implication, implementing technology-based tests for low- and high-stakes decisions in elementary and secondary schools and across all key content areas [6], which will ensure standardization of testing procedures [7]. It is, however, difficult to predict exactly when that transition will be complete. [3] has shown that CBT can expand the types of cognitive skills and processes measured. By allowing data to be collected during examinations, an accurate distinction can be made between "omitted" and "not reached" items, and response latency can also be recorded. Furthermore, [3] showed that a CBT system can produce large print and audio for vision-impaired examinees [8].
CBTs are economical, accurate, and time-bound. As such, primary, secondary, and tertiary institutions can adopt the system to solve the challenges noted above. Examination bodies such as the Joint Admission and Matriculation Board (JAMB) in Nigeria have already adopted a system that caters for its examinations across more than 500 CBT centers nationwide, which has helped overcome the challenges facing such examinations.
Another challenge facing CBT test designers and administrators is how to design and construct CBT software that is fair, reliable, and capable of producing valid test results. Some CBT candidates find it difficult to navigate backwards to rework problems. Some resist the computerized testing process because they are accustomed to taking notes and circling questions; others say that they read more quickly and more easily on paper [9] than on a glaring computer screen [10]. This paper proposes a system capable of handling these problems.
3. Background
A Logarithmic algorithm was used in implementing the said system, particularly
the modified Quick sort algorithm. The algorithm is based on the classic Di-
vide-and-Conquer approach. The process generates and processes a set of array
indexes that can run simultaneously to generate questions and answers auto-
matically.
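The divide-and-conquer scheme referred to above can be illustrated with a short, generic Quicksort sketch; this is a textbook version for exposition only, not the system's actual implementation:

```python
def quicksort(items):
    """Classic divide-and-conquer quicksort: pick a pivot,
    partition the remaining items around it, and recurse on each half."""
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    left = [x for x in rest if x < pivot]    # items smaller than the pivot
    right = [x for x in rest if x >= pivot]  # items greater than or equal
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([7, 2, 9, 4, 4, 1]))  # [1, 2, 4, 4, 7, 9]
```

On average the partitioning recursion divides the work in half, which is where the O(n log n) ("logarithmic") behavior the authors mention comes from.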
Algorithm 1: Generation of survey questions and answers.
Begin
  REPEAT
    DISPLAY "Type in the letter chosen, or No. 1-20 to finish"
    DISPLAY "Question 1"
    DISPLAY "A: Zero"
    DISPLAY "B: Single double bond"
    DISPLAY "C: More than one double bond"
    DISPLAY "D: Two double bonds"
    DISPLAY "Enter correct answer"
    ACCEPT letter
    IF letter = 'A' THEN Zero = Zero + 1
    IF letter = 'B' THEN SingleDoubleBond = SingleDoubleBond + 1
    IF letter = 'C' THEN MoreThanOneDoubleBond = MoreThanOneDoubleBond + 1
    IF letter = 'D' THEN TwoDoubleBonds = TwoDoubleBonds + 1
  UNTIL letter = 'A'
  DISPLAY "Zero scored", Zero, "wrong answer"
  DISPLAY "Single double bond scored", SingleDoubleBond, "wrong answer"
  DISPLAY "More than one double bond scored", MoreThanOneDoubleBond, "correct answer"
  DISPLAY "Two double bonds scored", TwoDoubleBonds, "wrong answer"
End
Algorithm 2: Generation of questions.
begin
  Step 1: Create an array A[ ] of size N
  Step 2: Generate a random number rand
  Step 3: CA-GRF {
    count = 0
    for i = 0 to N
      if (rand < N) {
        test(rand, i)
        A[i] = rand
        count = count + 1
      }
      else
        return 0
  }
  int test(rand, i) {
    int j = 0
    for j = i + 1 to N
      if (A[j] != rand)
        return rand
      else {
        regenerate rand until rand != A[j]
        return rand
      }
  }
end
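In effect, Algorithm 2 draws a set of distinct random indices into the question bank so that no question repeats within a session, re-drawing whenever the collision test fails. A minimal Python sketch of that idea (function and variable names are ours, not the paper's):

```python
import random

def generate_question_indices(n, pool_size):
    """Draw n distinct random indices from a question pool of pool_size,
    re-drawing on collision, analogous to Algorithm 2's test() step."""
    if n > pool_size:
        raise ValueError("question pool too small for the requested paper length")
    chosen = []
    seen = set()
    while len(chosen) < n:
        rand = random.randrange(pool_size)
        if rand not in seen:   # the collision test: index not used yet
            seen.add(rand)
            chosen.append(rand)
    return chosen

indices = generate_question_indices(5, 50)  # five distinct indices in [0, 50)
```

Python's standard library offers `random.sample(range(pool_size), n)` as a one-line equivalent; the explicit loop above mirrors the paper's collision-testing description.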
4. Proposed System Design Objectives
The main objective of this research is to transform the manual process of examinations, the Pencil-and-Paper Test (PPT), into an automated system: an interactive online computer-based test system (OCBTS). The specific objectives of this research are as follows:
1) To develop a CBT system that replaces the manual form of examination.
2) To develop a CBT system that automatically generates examination questions.
3) To develop a CBT system that automatically generates examination numbers for students.
4) To ascertain the operational effectiveness of the system.
5) To introduce a means of training students on or before actual use of the CBT system, called Computer-Assisted Learning (CAL).
6) To develop a CBT system with enhanced security features to avoid examination malpractice.
7) To design a CBT system with real-time, automatic processing of results for candidates.
8) To allow for bursary and online payments.
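Objective 3, the automatic generation of examination numbers, can be sketched as follows; the numbering format shown is an illustrative assumption, not the system's actual scheme:

```python
import itertools

def exam_numbers(prefix, year, count, start=1):
    """Yield sequential, zero-padded examination numbers, e.g. CBT/2018/0001."""
    for i in itertools.islice(itertools.count(start), count):
        yield f"{prefix}/{year}/{i:04d}"

nums = list(exam_numbers("CBT", 2018, 3))
# ['CBT/2018/0001', 'CBT/2018/0002', 'CBT/2018/0003']
```

Sequential assignment like this guarantees uniqueness per session; a production system would persist the last-issued counter in the database.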
5. Methodology
The methodology of this paper covers the paper format, the questionnaires, and the methods used in data collection.
5.1. Research Questions
The following five research questions (see Table 1) were formulated to address the problems identified in this study:
1) What issues are peculiar to the use of CBT among the students?
2) What are the general constraints on the use of CBT for student assessment?
3) What are the effects of the test administration mode on students' performance, i.e., their scores?
4) What is the relationship between prior computer experience and performance in CBTs?
5) What practices help improve the perception of CBTs?
D. P. Godwin et al.
DOI:
10.4236/oalib.1104919 6 Open Access Library Journal
Table 1. Paper format.
Timing (minutes) | No. of questions | Task types (e.g. writing a letter) | Answering
5.2. Software Specification Requirements
1) Software:
User Interface: PHP, XHTML, CSS, jQuery.
Client-Side Scripting: JavaScript, PHP scripting.
Programming Languages: PHP, ASP.
IDE/Workbench/Tools: Adobe Dreamweaver CS6, NetBeans.
Database: MySQL (optional: SQLite, Oracle 10g).
Server Deployment: Apache 2.2.4, Apache Tomcat.
2) Hardware:
Monitor: 17-inch LCD screen (optional).
Processor: Pentium 3/4, dual-core, Intel Core i7.
Hard Disk: 500 GB to 1-4 TB.
RAM: 4 GB or more.
5.3. The System Design and Implementation
The system design comprises several major components: the administrative account panel (for registration and login authentication), the user login panel, question generation, and pin-code or application-number generation.

The workflow of the system enables the user to easily understand the question-generation process. The system also provides a graphical user interface (GUI): a simple interactive interface for entering the details of question generation, i.e., the input to the database. The system administrator registers users and grants them access rights to the system (Figure 1).
The architecture of the system is designed structurally and constitutes three essential parts: the GUI, the Front-End, and the Back-End modules.
1) The GUI defines the structural design and how the system looks after implementation, detailing a unique and interactive platform that suits user needs.
2) The Front-End (FE) comprises everything the user sees, and includes the design and the programming languages used in building the system, such as PHP, MySQL, HTML, Bootstrap, and CSS.

Figure 1. System design architecture.

3) The Back-End (BE), otherwise called the Server-Side, deals with the system inputs, retrieval, edits, and updates. This refers to everything the user cannot see in the browser, such as the database and servers used.
6. Results and Discussion
The test plan is basically a list of tests. A convenience sample of about 50 questions per student was taken on the developed CBT test. Afterwards, a survey questionnaire was used for data collection. The data analysis demonstrated auspicious characteristics of the target context for the CBT implementation. A few students were not in favor of it, citing impaired validity of the test administration: they reported some erroneous formulas, equations, and structures in the test items. The deployed test process provides immediate scoring, speed, and transparency in marking. It is important to note that the test cases cover all aspects of the question-generating system.
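The satisfaction percentages reported in this section can be reproduced from the raw survey ratings with a short script; the ratings list below is illustrative sample data, not the actual survey records:

```python
def percent_at_least(ratings, threshold=2):
    """Share of respondents (as a percentage) who rated at or above
    `threshold` on the 1-3 scale used in the survey."""
    hits = sum(1 for r in ratings if r >= threshold)
    return round(100 * hits / len(ratings), 1)

# hypothetical competence ratings for ten respondents
competence = [3, 3, 2, 3, 1, 3, 3, 2, 3, 3]
print(percent_at_least(competence))  # 90.0
```

Running this per column (competence, ease of use, preference) over each school's table yields figures directly comparable to those quoted below.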
Description of Test Results
Tables 2-6 and Figures 2-16 present the results of the survey of 230 students in total, conducted by the administrator of the platform. The questions and answers were carefully worded, selected, and formulated to ascertain the students' level of CBT competence/awareness, ease of use of the platform, and preference for it over the manual process of examinations.
Each of the three aforementioned levels of questioning was ranked on a scale of 1-3: not too well (1), well (2), and very well (3). The idea is to compare the three levels of questioning, bearing in mind that age and gender could also be of great significance. The population of students concerned was well above 2000, so a random sample averaging 57 students per school (roughly 10%) was a good sample for estimating the general opinion of the students. The students were also randomly passing by while being issued
Table 2. Survey of 50 students for determining the performance of the CBT platform for Gombe State University, Gombe, Nigeria.
S/N  Age  Gender  Level of CBT Competence/Awareness  Level of Ease of Use/Navigation  Level of Preference of CBT and Readiness to Acquire Same over the Manual System
1 22 M 3 1 2
2 19 F 3 3 3
3 20 F 3 3 3
4 18 F 3 2 3
5 17 M 3 3 3
6 21 M 3 3 3
7 20 M 2 3 3
8 22 M 3 3 3
9 20 F 3 3 3
10 19 F 2 3 3
11 18 M 3 3 3
12 17 M 3 3 3
13 18 M 3 3 3
14 23 M 3 2 3
15 20 F 3 3 3
16 21 M 3 3 3
17 19 F 2 3 3
18 24 M 3 3 3
19 25 F 1 3 3
20 18 M 3 3 3
21 17 F 3 3 3
22 19 F 3 3 3
23 19 M 3 3 3
24 23 M 3 3 3
25 18 F 3 3 3
26 20 M 3 1 3
27 24 M 3 3 3
28 22 F 2 3 3
29 20 F 3 3 3
30 21 F 3 3 3
31 18 F 3 3 3
32 17 F 2 3 2
33 19 F 3 2 3
34 20 F 3 3 3
35 23 M 3 3 3
36 20 M 3 3 3
37 23 F 3 3 3
38 20 M 3 3 3
39 18 M 2 3 3
40 18 F 3 3 3
41 21 F 2 3 3
42 20 M 3 3 3
43 22 M 3 3 3
44 20 F 2 3 2
45 19 M 3 3 3
46 18 M 1 3 3
47 17 F 3 3 3
48 18 M 3 3 3
49 22 F 3 3 3
50 21 F 3 3 3
Table 3. Survey of 80 students for determining the performance of the CBT platform for Matrix International Academy, Gombe, Nigeria.
S/N  Age  Gender  Level of CBT Competence/Awareness  Level of Ease of Use/Navigation  Level of Preference of CBT and Readiness to Acquire Same over the Manual System
1 11 M 3 1 2
2 13 M 2 3 3
3 15 F 3 2 3
4 11 M 3 2 3
5 13 M 2 3 2
6 13 F 3 3 3
7 11 F 2 2 3
8 14 M 3 1 3
9 15 F 3 3 2
10 12 M 2 2 3
11 16 F 1 1 3
12 12 M 2 3 3
13 13 F 3 1 3
14 12 M 3 2 3
15 11 F 2 3 3
16 15 F 3 2 3
17 14 F 2 3 3
18 11 M 3 2 3
19 12 F 1 3 3
20 11 M 2 1 3
21 15 F 3 3 3
22 16 F 3 1 3
23 14 F 2 3 3
24 12 M 3 2 3
25 15 M 1 3 3
26 14 M 2 1 3
27 15 M 3 1 3
28 16 F 2 3 3
29 11 M 3 3 3
30 14 M 2 2 3
31 13 F 3 2 3
32 13 F 2 3 2
33 15 M 1 2 2
34 16 F 2 1 3
35 17 F 3 2 2
36 12 M 2 2 3
37 14 F 2 3 3
38 13 M 3 2 2
39 11 F 2 3 3
40 12 F 1 2 3
41 15 F 2 2 3
42 13 M 3 2 2
43 14 F 3 2 3
44 13 M 2 3 2
45 11 F 3 3 3
46 15 M 1 1 3
47 16 F 3 2 3
48 14 F 2 1 3
49 12 F 3 1 3
50 12 M 3 3 2
51 15 F 2 3 2
52 15 M 2 3 2
53 13 F 3 3 2
54 12 F 2 2 3
55 16 F 3 2 2
56 13 M 3 2 3
57 11 F 2 3 2
58 13 F 3 2 2
59 15 M 1 2 3
60 16 F 1 3 3
61 15 M 3 2 2
62 12 F 2 3 2
63 15 F 3 3 2
64 16 M 3 2 3
65 12 F 2 1 2
66 13 F 3 3 3
67 14 M 2 2 3
68 14 F 2 3 2
69 13 M 3 2 3
70 12 M 2 3 2
71 15 M 3 2 3
72 14 F 1 1 3
73 16 M 2 3 2
74 12 F 1 1 3
75 12 F 2 1 3
76 14 M 3 1 2
77 15 M 2 3 3
78 13 F 3 2 3
79 16 F 1 3 2
80 12 M 2 1 3
Table 4. Survey of 60 students for determining the performance of the CBT platform for Yahaya Ahmed Schools, Gombe, Nigeria.
S/N  Age  Gender  Level of CBT Competence/Awareness  Level of Ease of Use/Navigation  Level of Preference of CBT and Readiness to Acquire Same over the Manual System
1 13 F 3 1 2
2 15 F 2 3 2
3 13 F 2 3 2
4 15 F 2 2 1
5 11 F 3 3 2
6 14 F 1 3 2
7 12 F 2 3 3
8 12 F 3 3 2
9 11 F 3 3 3
10 11 F 2 3 2
11 15 F 3 3 3
12 16 F 2 1 3
13 15 F 3 3 3
14 13 F 2 2 3
15 14 F 2 3 3
16 11 F 3 3 3
17 12 F 2 3 2
18 12 F 3 3 3
19 13 F 2 3 2
20 13 F 1 3 3
21 16 F 3 3 3
22 14 F 1 3 3
23 11 F 3 3 2
24 13 F 2 3 2
25 14 F 2 3 2
26 11 F 3 1 2
27 13 F 2 3 3
28 13 F 3 3 3
29 12 F 2 3 2
30 12 F 2 2 2
31 11 F 3 3 3
32 16 F 2 3 2
33 14 F 2 2 3
34 14 F 2 3 3
35 13 F 1 2 2
36 11 F 3 3 2
37 15 F 3 2 2
38 16 F 3 2 2
39 12 F 2 3 2
40 13 F 3 3 3
41 16 F 2 2 3
42 11 F 3 2 2
43 11 F 3 2 3
44 15 F 2 3 2
45 12 F 3 2 3
46 12 F 1 3 3
47 13 F 3 2 2
48 12 F 1 2 2
49 16 F 3 2 3
50 14 F 2 2 3
51 13 F 3 2 3
52 13 F 2 2 3
53 11 F 3 3 2
54 16 F 3 2 3
55 13 F 2 2 3
56 12 F 3 3 3
57 15 F 1 3 3
58 14 F 1 2 3
59 11 F 3 2 3
60 11 F 3 3 2
Table 5. Survey of 40 students for determining the performance of the CBT platform for Gombe High School, Gombe, Nigeria.
S/N  Age  Gender  Level of CBT Competence/Awareness  Level of Ease of Use/Navigation  Level of Preference of CBT and Readiness to Acquire Same over the Manual System
1 11 M 1 1 2
2 11 F 2 3 3
3 14 F 3 1 3
4 16 M 3 2 3
5 12 M 3 3 3
6 12 M 3 3 2
7 14 F 2 2 3
8 15 F 3 3 3
9 11 M 1 2 3
10 11 M 3 3 3
11 14 F 3 2 2
12 12 F 2 3 3
13 11 M 3 3 3
14 15 F 3 2 2
15 13 F 3 3 3
16 14 F 3 3 3
17 13 M 1 3 2
18 14 F 3 3 3
19 11 M 3 3 3
20 12 F 3 2 3
21 13 F 3 2 3
22 15 M 3 3 3
23 15 F 2 2 3
24 12 F 3 3 3
25 13 F 3 3 3
26 16 F 2 2 2
27 15 M 2 2 3
28 16 M 3 2 2
29 11 F 3 3 3
30 13 F 3 3 3
31 13 M 3 3 3
32 12 F 1 2 2
33 15 M 3 2 2
34 14 F 2 3 3
35 14 M 2 3 3
36 15 F 3 2 2
37 14 M 3 3 3
38 13 M 2 3 3
39 11 M 3 3 3
40 12 F 3 3 3
Table 6. Survey of 50 students for determining the performance of the CBT platform for Gombe International School, Gombe, Nigeria.
S/N  Age  Gender  Level of CBT Competence/Awareness  Level of Ease of Use/Navigation  Level of Preference of CBT and Readiness to Acquire Same over the Manual System
1 14 F 3 1 2
2 11 F 3 3 3
3 11 F 2 3 3
4 14 F 1 2 3
5 12 M 3 3 3
6 12 M 3 3 3
7 15 M 3 3 3
8 16 M 3 3 3
9 15 F 3 3 3
10 15 M 1 3 2
11 13 F 3 2 3
12 13 F 3 3 3
13 14 F 2 3 3
14 13 M 2 2 2
15 15 M 3 3 3
16 15 M 3 2 3
17 13 M 2 2 3
18 13 F 3 3 3
19 12 M 1 3 3
20 14 M 3 3 3
21 15 M 3 2 3
22 16 F 3 3 3
23 14 F 1 2 2
24 13 F 3 2 3
25 13 F 3 3 3
26 12 M 3 2 3
27 15 M 3 2 2
28 11 M 2 3 3
29 11 M 1 2 2
30 13 F 3 2 3
31 11 F 3 3 3
32 15 F 2 3 2
33 16 M 3 2 3
34 13 M 1 2 3
35 16 F 2 3 2
36 12 M 3 3 3
37 12 M 1 2 2
38 14 M 3 3 3
39 11 F 2 3 3
40 11 M 2 2 2
41 12 F 2 3 2
42 16 F 2 2 3
43 14 F 3 3 3
44 14 M 2 3 2
45 13 M 3 3 3
46 16 M 1 2 2
47 15 F 3 3 3
48 15 M 3 3 3
49 11 F 2 2 3
50 12 F 2 3 2
Figure 2. Age vs level of competence, ease of use and preference.
Figure 3. Comparison between the levels of competence, ease of use
and preference.
Figure 4. The age distribution of surveyed students.
Figure 5. Age vs level of competence, ease of use and preference.
Figure 6. Comparison between the levels of competence, ease of use
and preference.
Figure 7. The age distribution of surveyed students.
Figure 8. Age vs level of competence, ease of use and preference.
Figure 9. Comparison between the levels of competence, ease of use
and preference.
Figure 10. The age distribution of surveyed students.
Figure 11. Age vs level of competence, ease of use and preference.
Figure 12. Comparison between the levels of competence, ease of use and preference.
Figure 13. The age distribution of surveyed students.
Figure 14. Age vs level of competence, ease of use and preference.
Figure 15. Comparison between the levels of competence, ease of use and preference.
Figure 16. The age distribution of surveyed students.
with invitations to attend the trial test of the developed CBT platform using a demo test. In [11] [12], the instrument of the study comprised four tests, measuring problem solving, inductive reasoning, working memory, and creativity, with a questionnaire focusing on participants' demographic data, learning strategies, and ICT familiarity. In this paper, by contrast, the students sat the demo test and were then surveyed on their level of CBT competence/awareness, their ease of use/navigation, and their preference for CBT over the manual pen-and-paper process of examinations.

From Table 2 and Figure 2, representing the 50 students selected from Gombe State University, Gombe, it can be clearly seen that age varied among the students, with about 85% being very competent in, or at least very aware of, what a CBT platform is and how it works. Also, about 90% said the platform was easy to navigate. Lastly, about 95% preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 3 shows that more than 72% of them were
generally satisfied on all three levels, while Figure 4, from the same institution, highlights the variation in the students' ages. More than 66% of them were above the age of 19, as seen in Figure 2 and Figure 4, suggesting that older students have more experience with CBT applications. However, the statistics portray a slightly different picture for secondary-school students, who are a little younger than their higher-institution counterparts, as seen in Figure 2.
Table 3 and Figure 5 represent Matrix International Academy, where 80 students were surveyed during their test. The age distribution varied among the students, with about 70% of them aged above 13 years, as seen in Figure 5 and Figure 7. More than 88% of the students were very competent in, and very aware of, how a CBT platform works. About 20% of them did not find the platform easy to navigate. Furthermore, about 80% of the students preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 6 shows that more than 80% of the students were generally satisfied with their level of CBT awareness, ease of navigation, and preference for CBT over the manual process of examinations.
Table 4 and Figure 8 represent Yahaya Ahmed Schools, where 60 students were surveyed during their test. The age distribution varied among the students, with about 60% of them aged above 13 years, as seen in Figure 8 and Figure 10. More than 85% of the students were very competent in, and very much aware of, how a CBT platform works. Only about 6% of them did not find the platform easy to navigate. Furthermore, about 80% of the students preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 9 shows that more than 96% of the students were generally satisfied with their level of CBT awareness, ease of navigation, and preference for CBT over the manual process of examinations. The branch of the school that was surveyed is female-only, as seen in Table 4.
Table 5 and Figure 11 represent Gombe High School, where 40 students were surveyed during their test. The age distribution also varied among the students, with about 62% of them aged above 13 years, as seen in Figure 11 and Figure 13. More than 90% of the students were very competent in, and very much aware of, how a CBT platform works. Only about 5% of them did not find the platform easy to navigate. Furthermore, about 98% of the students preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 12 shows that more than 95% of the students were generally satisfied with their level of CBT awareness, ease of navigation, and preference for CBT over the manual process of examinations.
Table 6 and Figure 14 represent Gombe International School, where 50 students were surveyed during their test. The age distribution also varied among the students, with about 32% of them aged between 14 and 16 years, as seen in Figure 14 and Figure 16, showing that a majority of the students are younger. Only about 16% of the students were not competent in, or aware of, how a CBT platform works. Also, only about 2% of them did not find the platform easy to navigate. Generally, almost all the students preferred the CBT platform to the conventional manual process of conducting tests and examinations, as seen in Figure 15.
7. Conclusions
This paper presented a logarithmic Quicksort algorithm for solving a highly constrained question-and-answer CBT system. The approach used a problem-specific domain representation, with stochastic and context-based reasoning, to obtain feasible results for questions answered by a student within a reasonable time.

The system uses a procedural method to generate questions and answers automatically, making it easy to understand and quick to process. The system completely eliminates the manual process of writing examinations in the conventional pen-and-paper style.
The system automatically handles the other time-consuming processes of generating exam scores and setting up examination questions, such that over time a considerable database of questions can easily be queried and reshuffled at random for an arbitrary number of sessions. By implication, acquiring more infrastructure to accommodate more students is not required, because every session of the same examination is scrambled and reset for the next set of students sitting the same test or examination.

Future work will entail real-time generation, content analysis, and reports generated for management information system (MIS) purposes. The work will also include aspects of machine learning, particularly Recurrent Neural Networks (RNNs), to determine and predict the level of collisions likely to occur whenever fresh questions are scrambled.
Acknowledgements
We acknowledge our Creator for the knowledge passed down. Our families, friends, and colleagues are all appreciated for their effort. The open-source software communities are also acknowledged for providing freely downloadable software, particularly PHP, XHTML, CSS, and jQuery for interface design, and Apache 2.2.4 and Apache Tomcat for data service deployment. This has made research much easier.
Conflicts of Interest
The authors declare no conflicts of interest regarding the publication of this paper.
References
[1] Baker, R.S., DMello, S.K., Rodrigo, M.M.T. and Graesser, A.C. (2010) Better to Be
Frustrated Than Bored: The Incidence, Persistence, and Impact of Learners’ Cogni-
tive-Affective States during Interactions with Three Different Computer-Based
Learning Environments.
International Journal of Human-Computer Studies
, 68,
223-241. https://doi.org/10.1016/j.ijhcs.2009.12.003
[2] Lee, K.S.K., Wilson, S., Perry, J., Room, R., Callinan, S., Assan, R., Hayman, N.,
Chikritzhs, T., Gray, D., Wilkes, E., Jack, P. and Conigrave, K.M. (2018) Developing
a Tablet Computer-Based Application (‘App’) to Measure Self-Reported Alcohol
Consumption in Indigenous Australians.
BMC Medical Informatics and Decision
Making
,
18, 8. https://doi.org/10.1186/s12911-018-0583-0
[3] Cynthia, G.P., Judith, A.S., John, C.K. and Tim, D. (2002) Practical Considerations
in Computer-Based Testing. Considerations in Computer-Based Testing. Sprin-
ger-Verlag, New Jersey, 1-2.
[4] Nugroho, R.A., Kusumawati, N.S. and Ambarwati, O.C. (2018) Students Perception
on the Use of Computer Based Test. IOP Conference Series.
Materials Science and
Engineering
, 306, Article ID: 012103.
https://doi.org/10.1088/1757-899X/306/1/012103
[5] Darrell, L.B. (2003) The Impact of Computer-Based Testing on Student Attitudes and Behaviour. The Technology Source Archives, University of North Carolina, USA. http://technologysource.org/article/impact_of_computerbased_testing_on_student_attitudes_and_behavior/
[6] Bennett, R.E. (2002) Inexorable and Inevitable: The Continuing Story of Technology and Assessment. Journal of Technology, Learning, and Assessment, 1, 1-2. https://ejournals.bc.edu/ojs/index.php/jtla/article/download/1667/1513
[7] Dembitzer, L., Zelikovitz, S. and Kettler, R.J. (2018) Designing Computer-Based Assessments: Multidisciplinary Findings and Student Perspectives. International Journal of Educational Technology, 4, 20-31. https://educationaltechnology.net/ijet/index.php/ijet/article/view/47
[8] Prastikawat, F.A. and Huda, A. (2018) The Effect of Computer-Based Image Series Media toward Description Writing Skills of a Student with Intellectual Dissability in the Grade VII SMPLB. Journal of ICSAR, 2, 52-56.
[9] Clariana, R. and Wallace, P. (2002) Paper-Based versus Computer-Based Assessment: Key Factors Associated with the Test Mode Effect. British Journal of Educational Technology, 33, 593-602. https://doi.org/10.1111/1467-8535.00294
[10] Bridgeman, B., Lennon, M.L. and Jackenthal, A. (2001) Effects of Screen Size, Screen Resolution, and Display Rate on Computer-Based Test Performance. ETS Research Report Series, 2001, 1-23.
[11] Butcher, J.N., Perry, J.N. and Atlis, M.M. (2000) Validity and Utility of Computer-Based Test Interpretation. Psychological Assessment, 12, 6-18. https://doi.org/10.1037/1040-3590.12.1.6
[12] Wu, H. and Molnár, G. (2018) Computer-Based Assessment of Chinese Students' Component Skills of Problem Solving: A Pilot Study. Literacy, 1, 5.