Selection and Modeling of a Formal Heuristic
Evaluation Process Through Comparative
Analysis
Adrian Lecaros(B), Arturo Moquillaza , Fiorella Falconi , Joel Aguirre ,
Alejandro Tapia , and Freddy Paz
Pontificia Universidad Católica del Perú, Av. Universitaria 1801, San Miguel, Lima 32, Lima,
Perú
{adrian.lecaros,ffalconit,aguirre.joel}@pucp.edu.pe,
{amoquillaza,a.tapiat,fpaz}@pucp.pe
Abstract. Due to the importance of usability, multiple usability evaluation meth-
ods have been proposed that help Human-Computer Interaction (HCI) specialists
determine whether the interfaces of a software product are usable, easy to use,
understandable, and attractive. Among these methods, the heuristic evaluation
proposed by Jakob Nielsen is the one that stands out. Although Nielsen offers
general guidelines for the execution of heuristic evaluations, very few authors of
different studies present a formal agreement or process on how the evaluations
should be carried out, which leads us to the problem of the absence of a com-
parative analysis that allows determining the most appropriate formal evaluation
process to carry out heuristic inspections. In addition, some proposals found
in the literature compare the results of the execution of heuristic evaluations, the
definition of new heuristics, qualification forms, and the formalization of the com-
plete process in 5 phases. Although these proposals contribute to the formalization
of the heuristic evaluation process, the literature review has not provided a com-
parative analysis that allows determining which is the most appropriate, which
could cause usability evaluators to interpret and carry out their own procedure,
which would lead to inaccuracies in the results and increase the probability of
improperly executing the inspection. The purpose of this study was to elaborate a comparative table of the formal processes; this allowed the grouping of the various studies found in the literature, the selection of a process, and thus the modeling of the whole process with a BPMN tool. Finally, this modeled process was validated by the expert judgement of HCI specialists.
Keywords: Human-Computer Interaction · Usability · Heuristic evaluation · Process · BPMN
1 Introduction
Nowadays, usability is an essential aspect of the User Experience in the context of interaction between users and software products since it plays a fundamental role in the use, acceptance, and interaction of users with those software products [1].
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
M. M. Soares et al. (Eds.): HCII 2022, LNCS 13321, pp. 28–46, 2022.
https://doi.org/10.1007/978-3-031-05897-4_3
Due to
the importance of usability, multiple usability evaluation methods have been proposed
that help Human-Computer Interaction specialists determine whether the interfaces of
a software product are usable, easy to use, understandable, and attractive. Among these
methods, the heuristic evaluation proposed by Jakob Nielsen is the one that stands out
[2]. Although Nielsen offers general guidelines for the execution of heuristic evalua-
tions, very few authors of different studies present a formal agreement or process on
how the evaluations should be carried out, which leads us to the problem of the absence
of a comparative analysis that allows determining the most appropriate formal evaluation process to carry out heuristic inspections [3]. In addition, some proposals
found in the literature compare the results of the execution of heuristic evaluations, the
definition of new heuristics, qualification forms, and the formalization of the complete
process in 5 phases: planning, training, evaluation, discussion, and report. Although
these proposals contribute to the formalization of the heuristic evaluation process, the
literature review has not provided a comparative analysis that allows determining which
is the most appropriate, which could cause usability evaluators to interpret and carry
out their own procedure, which would lead to inaccuracies in the results and increase
the probability of improperly executing the inspection. As the objective of this study, a
comparative table of the formal processes was elaborated, and this allowed the grouping
of the various studies found in the systematic literature review, which have used proto-
cols or formal processes that contribute, to a certain extent, to the execution of heuristic
evaluations. This grouping made it possible to measure and compare the characteristics,
the degree of coverage, and the benefits of each one, and the result led to a complete
formal process. This result was achieved through a systematic literature review that sought to answer the question: "What are the characteristics of the protocols or formal processes that are being used to improve the performance of heuristic evaluations of software products?". The comparison of the characteristics showed that the formal process in the category "Formal processes with duly defined steps" obtained the best results: it provided a positive response for each of the criteria, contributed to the entire heuristic evaluation process through duly defined steps, and consolidated the different ways in which evaluations are carried out into a single one, reducing the room for interpretation left to usability evaluators in how to carry them out.
Additionally, the detailed diagram of the selected process was made with the objective of modeling in detail the steps and tasks that must be followed to carry out the process correctly and without room for different interpretations by the evaluators who use it.
In addition, the main process has been divided into the execution of 5 sub-processes
that correspond to the steps indicated in the selected formal process: (1) planning, (2)
training, (3) evaluation, (4) discussion and (5) report. This result was achieved using the
Bizagi BPMN Modeler tool [4], which allowed the design and modeling of the process
through the BPMN notation. This modeling consisted of 6 flows that are presented in this
research. All these deliverables were validated by expert judgement of HCI specialists.
This paper is structured as follows: In Sect. 2, we describe the main concepts belong-
ing to the Human-Computer Interaction area used in the study. In Sect. 3, we present the
result of the systematic literature review that answers the question of the characteristics
of the protocols or formal processes that are being used to improve the performance
of heuristic evaluations of software products. In Sect. 4, we present the comparative
analysis and discussion of the protocols or formal processes and the selected process for
this research. In Sect. 5, we present the modeling of the selected formal process using
the Bizagi BPMN Modeler tool, and its validation. Finally, in Sect. 6, we present the
conclusions of the research and the future works to be done.
2 Background
In this section, we present the main concepts related to this work.
2.1 Usability
Usability, according to ISO 9241-210:2019 [5], is the "extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use".
Additionally, Jakob Nielsen [6] defines it as the evaluation of five attributes that the user interface of a system must have, which are the following:
Learnability: The system should be easy to learn so that the user can perform some
tasks with the system as quickly as possible.
Efficiency: The system should be efficient during its use to provide the highest level
of productivity possible.
Memorability: The system should be easy to remember so that the casual user can use
it again after a period of leaving it, without the need to relearn how it works.
Errors: The system should provide a low error rate so that users make as few errors as possible and can quickly recover from them. Errors considered catastrophic must not occur.
Satisfaction: The system should be pleasant to use so that users are subjectively
satisfied while using it.
2.2 Heuristic Evaluation
According to Andreas Holzinger [7], Heuristic Evaluation (HE) belongs to the usability inspection methods and is the most common informal method. Its execution requires usability experts who can identify whether the dialogue elements or other interactive software elements follow the established principles of usability.
According to Jakob Nielsen [6], the heuristic evaluation allows the inspection of
what is good and bad in the interface of a system, which could be done through one’s
own opinion or, ideally, using well-defined guidelines. The author also maintains that the main goal of the evaluation is to find usability problems in the design of an interface, and that it is carried out by a group of evaluators who inspect and judge the interface against usability principles called heuristics. Additionally, a single evaluator can find only 35% of the
usability problems in an interface; however, each evaluator usually encounters different
types of problems; for this reason, he recommends the participation of 3 to 5 evaluators
to obtain the best cost-benefit ratio. These evaluations are performed individually, and then, upon completion, the results are compared for an overall usability analysis.
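Nielsen's cost-benefit argument can be illustrated with the well-known estimate that the proportion of problems found by i independent evaluators grows as 1 − (1 − λ)^i, where λ is the detection rate of a single evaluator (the 35% figure cited above). A minimal sketch, assuming this independence model:

```python
def proportion_found(evaluators: int, single_rate: float = 0.35) -> float:
    """Expected proportion of usability problems found by a group of
    independent evaluators, assuming each one finds problems at
    `single_rate` (the 35% Nielsen cites for a single evaluator)."""
    return 1 - (1 - single_rate) ** evaluators

# Coverage rises steeply up to 3-5 evaluators, then flattens
# (diminishing returns), which motivates Nielsen's recommendation:
coverage = {i: proportion_found(i) for i in range(1, 7)}
```

Under these assumptions, three evaluators already find roughly three quarters of the problems, and each evaluator beyond five adds little, which is the cost-benefit trade-off described above.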
2.3 Systematic Literature Review
According to Barbara Kitchenham and Stuart Charters [8], Systematic Literature Review
(SLR) consists of identifying, evaluating, and interpreting the most relevant available
studies and answering review questions about an area or phenomenon of interest. Addi-
tionally, for this study, we will use the formal protocol proposed by Kitchenham that
consists of the review being carried out in three phases: (1) planning, (2) conducting and
(3) reporting.
Planning consists of identifying the need for the review, specifying the research questions, and developing and evaluating the review protocol. The conducting phase covers the identification of research, the selection of primary studies, and the extraction, monitoring, and synthesis of data. Finally, the reporting phase covers the specification of dissemination mechanisms, as well as the formatting and review of the main report generated.
2.4 Heuristic Evaluation Formal Process
According to Freddy Paz [9], it consists of a framework, based on the analysis of case studies, that provides a structured way to execute the heuristic evaluation and reduce the different interpretations that arise from its use. An example of a formal process is the one defined by the author, who establishes five phases for its execution: (1) planning, (2) training, (3) evaluation, (4) discussion and (5) report.
3 Characteristics of Protocols or Formal Processes Used to Improve
the Performance of Heuristic Evaluations
According to the systematic literature review from our previous research [3], the research question "What are the characteristics of the formal protocols or processes that are being used to improve the performance of heuristic evaluations in software products?" was answered by grouping the information obtained from 33 studies into 6 main protocols. Table 1 shows the primary studies selected to answer the review question.
Table 1. Selected primary studies

ID | Study | Citation | Search engine
S01 | Usability Problem Areas on Key International and Key Arab E-commerce Websites | [10] | Scopus
S02 | Usability evaluation of web-based interfaces for Type2 Diabetes Mellitus | [11] | Scopus
S03 | Heuristic Evaluations of Cultural Heritage Websites | [12] | Scopus
S04 | PROMETHEUS: Procedural Methodology for Developing Heuristics of Usability | [13] | Scopus
S05 | A formal protocol to conduct usability heuristic evaluations in the context of the software development process | [14] | Scopus
S08 | Observation and heuristics evaluation of student web-based application of SIPADU-STIS | [15] | Scopus
S09 | Usabilidad en sitios web oficiales de las universidades del ecuador | [16] | Scopus
S10 | The E-health Literacy Demands of Australia’s My Health Record: A Heuristic Evaluation of Usability | [17] | Scopus
S11 | Experimental validation of a set of cultural-oriented usability heuristics: e-Commerce websites evaluation | [18] | Scopus
S12 | A heuristic evaluation on the usability of health information websites | [19] | Scopus
S13 | Websites with multimedia content: A heuristic evaluation of the medical/anatomical museums | [20] | Scopus
S14 | A Heuristic Evaluation for Deaf Web User Experience (HE4DWUX) | [21] | Scopus
S15 | A comparative study of video content user interfaces based on heuristic evaluation | [22] | Scopus
S16 | Heuristic evaluation for Virtual Museum on smartphone | [23] | Scopus
S17 | A User-Centered Design for Redesigning E-Government Website in Public Health Sector | [24] | Scopus
S18 | Developing Usability Heuristics: A Formal or Informal Process? | [25] | Scopus
S20 | Método para la evaluación de usabilidad de sitios web transaccionales basado en el proceso de inspección heurística | [9] | Alicia Concytec
S21 | Usability testing of conferences websites: A case study of practical teaching | [26] | Scopus
S22 | Evaluating the usability of the information architecture of academic library websites | [27] | Scopus
S23 | A perception study of a new set of usability heuristics for transactional websites | [28] | Scopus
S24 | Comparing the effectiveness and accuracy of new usability heuristics | [29] | Scopus
S26 | Quantifying the usability through a variant of the traditional heuristic evaluation process | [30] | Scopus
S27 | University students’ heuristic usability inspection of the national library of Turkey website | [31] | Scopus
S28 | Usability heuristics evaluation in search engine | [32] | Scopus
S29 | A collaborative RESTful cloud-based tool for management of chromatic pupillometry in a clinical trial | [33] | Scopus
S30 | Heuristic Evaluation of eGLU-Box: A Semi-automatic Usability Evaluation Tool for Public Administrations | [34] | Scopus
S31 | The Relationship of the Studies of Ergonomic and Human Computer Interfaces A Case Study of Graphical Interfaces in E-Commerce Websites | [35] | Scopus
S32 | Evaluation of usability heuristics for transactional web sites: A comparative study | [36] | Scopus
S33 | Heuristics for grid and typography evaluation of art magazines websites | [37] | Scopus
S34 | Usability of tourism websites: a case study of heuristic evaluation | [38] | Scopus
S35 | Programmer experience: a set of heuristics for programming environments | [39] | Scopus
S36 | Exploring the usability of the central library websites of medical sciences universities | [40] | Scopus
S37 | Heuristic Usability Evaluation of University of Hong Kong Libraries’ Mobile Website | [41] | Scopus
Also, Table 2 shows the protocols or formal processes and the studies in which they were found.
Table 2. Studies that include protocols or formal processes used for heuristic evaluations

Protocol or formal process | Research studies | # Studies
Definition of new heuristics | S01, S03, S04, S11, S14, S18, S22, S23, S24, S29, S31, S32, S33, S35 | 14
Nielsen derived evaluation items | S12, S13, S15, S21, S27, S28, S30, S34, S36, S37 | 10
Comparison and grouping between evaluations | S01, S08, S10, S17, S18, S23, S24, S32 | 8
Qualification methods | S02, S16, S34 | 3
Methodologies applied to heuristic evaluation | S09, S22, S26 | 3
Formal processes with properly defined steps | S05, S20 | 2
Descriptions of formal protocols or processes and their benefits are presented below.
1. Definition of new heuristics: Definition of new heuristics is a fundamental protocol
applied in the execution of heuristic evaluations since, despite having the Nielsen
heuristics, these will not always be able to cover what is required for the usability
evaluation of a particular software [S01 & S03], as well as relate to the object of study
[S04]. For this reason, the benefits obtained with the definition of new heuristics are
the best representation of the case study [S01] that could not be performed using only
Nielsen heuristics [S03] and obtaining usability problems that are desired to be found
in a particular system [S04, S11, S14, S22, S23, S24, S29, S31, S32, S33 & S35].
Additionally, methodologies applied to the generation of new heuristics [S18] such
as PROMETHEUS are presented, which will allow dividing the process into stages
that validate that the heuristics are being properly constructed, iterating through a
refinement step if necessary [S04].
2. Nielsen derived evaluation items: One of the protocols used for the evaluation that
is derived from Nielsen is the use of his ten heuristics in conjunction with Shneiderman's eight golden rules for grouping them into evaluation elements [S12]. Additionally, in another protocol, phases are used for the application of the evaluation,
which consist of the initial navigation on the website to judge the flow, perception
and dynamics of the interaction, and then, in the second phase, use the metrics of
Nielsen usability heuristics [S13]. Other studies show that Nielsen heuristics could
be used in conjunction with metrics to enrich the collected data [S27], with attributes defined in ISO/IEC 25023 [S28] and in conjunction with case study-specific heuristics [S34]. Finally, the ten Nielsen heuristics could also be applied by themselves to
perform usability evaluations [S15, S21, S30, S36 & S37].
3. Comparison and grouping between evaluations: Comparing the results of heuristic evaluations between the pages of the country under evaluation and key international pages makes it possible to provide improvement indicators and recommendations of good practices [S01]. Likewise, grouping evaluation questions allows comparisons
between the generated groups to be made, in order to obtain more precise answers
[S08]. Additionally, dividing heuristic evaluations into groups by used heuristics
allows comparisons to be made of how many usability problems have been found
with each one [S10]. Complementarily, if a heuristic evaluation is made to a website
and then an additional one is made to the improved site as a result of the first evalua-
tion, it will be possible to quantitatively compare how much the score has improved
[S17]. Finally, comparing test execution with Nielsen heuristics in contrast to newly
designed heuristics allows the measurement of their effectiveness [S18, S23, S24 &
S32].
4. Qualification methods: The use of Nielsen severity rating and Olson’s ease of fix will
allow providing a quantitative assessment according to the severity of the usability
problem encountered and how easy or difficult it will be to fix it [S02]. Likewise,
another form of rating is the use of a heuristic evaluation questionnaire that consists of assigning a score on a five-point Likert scale for the evaluation of prototypes [S16], as well as adding structured sub-elements to the defined heuristics whose questions can be used by evaluators [S34].
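As an illustration of how such qualification methods combine, the sketch below prioritizes findings by Nielsen's 0-4 severity rating together with an ease-of-fix rating; the tie-breaking scheme and the sample findings are illustrative assumptions, not taken from the cited studies:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    severity: int      # Nielsen severity rating: 0 (not a problem) .. 4 (catastrophe)
    ease_of_fix: int   # ease-of-fix rating: 0 (trivial) .. 3 (nearly impossible)

def prioritize(findings: list) -> list:
    """Order findings so the most severe come first; among equally severe
    problems, the easier fixes come first (a common triage choice)."""
    return sorted(findings, key=lambda f: (-f.severity, f.ease_of_fix))

# Hypothetical findings from an inspection:
findings = [
    Finding("Low-contrast labels", severity=2, ease_of_fix=0),
    Finding("No undo after deletion", severity=4, ease_of_fix=2),
    Finding("Inconsistent button wording", severity=2, ease_of_fix=1),
]
ranked = prioritize(findings)
```

The quantitative ratings are what make the prioritization possible: the team can address the catastrophic problem first, then pick off the cheap fixes among the moderate ones.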
5. Methodologies applied to heuristic evaluation: The protocols oriented to heuristic
evaluation methodologies show the proposal for the use of a heuristic method based on the ISO 9241-151 standard that consists of the execution of a heuristic evaluation
by applying indicators divided into criteria that allow each indicator to be assessed,
which facilitates the evaluation of results by the criterion as well as in general [S09].
In addition, a framework is defined for the evaluation of dialogue elements in web
pages so that the evaluators consider the same previously defined elements to be
evaluated, as well as a definition of a workflow that allows its applicability in other
types of software [S22]. Finally, the definition of a quantitative evaluation procedure
of the results of a heuristic evaluation is proposed to obtain consistent and quantifiable
results without the need for user participation [S26].
6. Formal processes with properly defined steps: The heuristic inspection process can
be fully formalized through five duly defined steps: (1) planning, (2) training, (3)
evaluation, (4) discussion, and (5) report [S05 & S20]. In this way, this formalization
consolidates the different ways in which evaluations are carried out into a single one,
allowing interpretations on how to carry them out to be reduced [S05 & S20].
4 Comparative Analysis and Discussion
After grouping the studies according to their main characteristics, a comparative table was made that allowed the comparison of said protocols or formal processes on general criteria such as the level of coverage, which consisted of qualifying the contribution to the heuristic evaluation process in order to classify them into one of the following levels as appropriate: Low, Medium, and High. In addition, the benefits found and the types of software evaluated by each one were shown. Finally, the definition of new heuristics protocol was not taken into consideration since, although it is a formal process, its direct contribution to the heuristic evaluation process is not as significant, as it only results in new heuristics for inspection. Table 3 shows the characteristics mentioned.
Table 3. Characteristics of the protocols or formal processes

Nielsen derived evaluation items
- Coverage level: Medium. They contribute mainly to the execution of the heuristic evaluation by using the Nielsen heuristics to carry out the evaluation together with additional elements that allow improving the inspection.
- Found benefits: (1) They allow for quantitative results when using metrics related to Nielsen's heuristics. (2) They allow the heuristic evaluation to be carried out, covering the demands proposed by conventional professional standards. (3) They allow adding additional heuristics in conjunction with the Nielsen heuristics.
- Application: Health information websites, medical museums, conferences, libraries, and tourism, as well as video content interfaces.

Comparison and grouping between evaluations
- Coverage level: Low. They contribute mainly to the comparison of results of heuristic evaluations by proposing more than one form of execution to finally evaluate which is the best.
- Found benefits: (1) They allow usability evaluation results to be compared to generate good practice recommendations. (2) They allow comparing how much the results of a new system have improved compared to the initial one. (3) They allow measuring the effectiveness between groups of heuristics.
- Application: E-commerce, student, health, and transactional websites.

Qualification methods
- Coverage level: Medium. They contribute mainly to the execution of the heuristic evaluation by providing protocols that allow formalizing the qualification during the inspection.
- Found benefits: (1) They allow determining and prioritizing which usability problems must be solved as soon as possible. (2) They allow the structuring of questions for the evaluators so that their answers are acquired quickly.
- Application: Diabetes screening and tourism websites, as well as virtual museum mobile apps.

Methodologies applied to heuristic evaluation
- Coverage level: Medium. They contribute mainly to the execution of the heuristic evaluation through indicators and the inspection of dialogue elements. In addition, they contribute to the results, as they are considered procedures for obtaining quantitative results.
- Found benefits: (1) They allow assessments to be given through indicators that will facilitate the evaluation of the result by criteria and in total. (2) They allow all evaluations to consider the same navigation elements. (3) They allow obtaining consistent and quantifiable results of the evaluation carried out.
- Application: University and library websites.

Formal processes with properly defined steps
- Coverage level: High. They contribute to the entire heuristic evaluation process through duly defined steps: planning, training, evaluation, discussion, and reporting.
- Found benefits: They consolidate the various ways in which evaluations are carried out into one, allowing the interpretations of usability evaluators in how to carry them out to be reduced.
- Application: Transactional websites.
Additionally, to carry out the comparative analysis, a comparative analysis matrix
was prepared and it allowed the characteristics of each process to be compared through
the following criteria:
Applied contexts: It is verified if it has been applied in an academic or industrial
context or in both.
Types of software evaluated: These are the types of software present in the various studies.
Does it divide the evaluation into duly defined steps?: It is verified if the studies present
the protocol or formal process with indications of steps that must be followed to carry
out the inspection.
Is it a complement or a new proposal to the evaluation process proposed by Nielsen?:
It is verified if the way of carrying out the evaluation is an adaptation of the process
proposed by Nielsen that includes additional characteristics, or if a new process has
been proposed.
Does it include previous training for evaluators?: It is verified if the previous training
for evaluators is included as part of the protocol or formal process.
The validation of the submitted proposal is evidenced: It is verified if the protocol or
formal process used in the studies has been validated.
Result of the evidence: Detail of the evidence of the validation of the proposal, if any.
Coverage level of the Heuristic Evaluation: Measurement of the coverage of the
evaluation, which can be (1) Low, (2) Medium or partial, (3) High or total.
Description of coverage: Justification of the level of coverage of the evaluation.
Benefits found: Detail of the benefits found by the use of the protocol or formal
process.
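The measurable criteria above can be pictured as a small decision matrix. The sketch below encodes the boolean criteria from the comparative analysis (the dictionary keys and the simple satisfied-criteria count are our own illustrative assumptions, not part of the original matrix) and selects the protocol that satisfies the most of them:

```python
# Measurable criteria per protocol, as reported in the comparative analysis
# (True = criterion satisfied). Key names are illustrative assumptions.
matrix = {
    "Nielsen derived evaluation items":
        {"defined_steps": False, "new_proposal": False, "prior_training": False, "validated": False},
    "Comparison and grouping between evaluations":
        {"defined_steps": False, "new_proposal": False, "prior_training": False, "validated": False},
    "Qualification methods":
        {"defined_steps": True, "new_proposal": False, "prior_training": False, "validated": False},
    "Methodologies applied to heuristic evaluation":
        {"defined_steps": False, "new_proposal": False, "prior_training": False, "validated": False},
    "Formal processes with properly defined steps":
        {"defined_steps": True, "new_proposal": True, "prior_training": True, "validated": True},
}

def best_protocol(matrix: dict) -> str:
    """Pick the protocol that satisfies the largest number of criteria."""
    return max(matrix, key=lambda name: sum(matrix[name].values()))
```

Counting satisfied criteria this way reproduces the paper's conclusion: only one category answers every criterion positively.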
Table 4 shows a summary of the comparative analysis matrix with the result of its five measurable criteria, which were considered because they were the ones that allowed a comparative analysis between the protocols and formal processes to give a measurable answer on how much each one contributed to the heuristic evaluation process.
Table 4. Summary of the comparative analysis matrix

Protocol or formal process | Does it divide the evaluation into duly defined steps? | Complement or new proposal to Nielsen's process? | Does it include prior training for evaluators? | Validation of the proposal is evidenced? | Coverage level of the Heuristic Evaluation
Nielsen derived evaluation items | No | Complement | No | No | Medium or partial
Comparison and grouping between evaluations | No | Complement | No | No | Low
Qualification methods | Yes (one study) | Complement | No | No | Medium or partial
Methodologies applied to heuristic evaluation | No | Complement | No | No | Medium or partial
Formal processes with properly defined steps | Yes | New proposal | Yes | Yes | High or total
The comparison of the characteristics showed that the formal process in the category "Formal processes with properly defined steps" obtained the best results: it provided a positive response for each of the criteria, contributed to the entire heuristic evaluation process through duly defined steps, and consolidated the different ways in which evaluations are carried out into one, allowing the interpretations of usability evaluators on how to carry them out to be reduced. For this reason, this formal process was selected as the basis of this research project.
5 Selected Formal Process BPMN Model
The detailed diagram of the selected process was made with the aim of modeling in
detail the steps and tasks that must be followed to carry out the process properly, and
that there is no room for different interpretations by the evaluators who use it. In addition,
the process has been divided into the main flow that consists of the execution of 5 sub-
processes that correspond to the steps indicated in the selected formal process [9]: (1)
planning, (2) training, (3) evaluation, (4) discussion and (5) report.
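The division into sequential sub-processes can be sketched as an ordered pipeline in which each phase enriches the artifacts produced by the previous one. The function names and artifact keys below are illustrative assumptions, not elements of the BPMN model itself:

```python
# The five sub-processes of the selected formal process [9], run in order.
# Each phase is a function from the accumulated context (a dict of
# artifacts) to an updated context; contents are illustrative only.

def planning(ctx):
    ctx["scope"] = "objective, software product, heuristics, template, team"
    return ctx

def training(ctx):
    ctx["trained_team"] = True  # inexperienced evaluators are trained first
    return ctx

def evaluation(ctx):
    ctx["findings"] = ["problem 1", "problem 2"]  # per-evaluator findings
    return ctx

def discussion(ctx):
    ctx["final_list"] = sorted(set(ctx["findings"]))  # consolidated problems
    return ctx

def report(ctx):
    ctx["report"] = f"{len(ctx['final_list'])} usability problems reported"
    return ctx

PHASES = [planning, training, evaluation, discussion, report]

def run_heuristic_evaluation():
    ctx = {}
    for phase in PHASES:
        ctx = phase(ctx)
    return ctx
```

The point of the sketch is the strict ordering: no phase starts until the previous one has produced its artifacts, which is exactly what the BPMN main flow enforces.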
This result was achieved using the Bizagi BPMN Modeler tool, which allowed the design and modeling of the process through the BPMN notation. This modeling consisted of 6 flows that are detailed below:
General: Flow in which all the sub-flows that must be followed to carry out the process in its entirety are described. Figure 1 shows the BPMN diagram of said process flow.
Fig. 1. General process flow diagram
Planning: Flow that defines the initial phase related to the preparation of the heuristic
evaluation. In this phase, the evaluation manager carries out the preliminary steps
to carry out the evaluation. Likewise, the objective and scope of the evaluation, the
software product to be evaluated, the profiles and the number of evaluators that will
make up the team, the set of heuristics and the evaluation template to be used are
defined. Once the above has been defined, the professionals who will form part of
the evaluation team are recruited. Figure 2 shows the BPMN diagram of said process flow.
Fig. 2. Planning flow diagram
Training: Flow that defines the training phase for evaluators with little or no experience. In this phase, the evaluation manager prepares a training session for the evaluators who meet the mentioned criteria. Once the training has been completed,
the manager informs the evaluation team of the goals, objectives, framework criteria
and a general idea of the software product, as well as the type of evaluation to be
carried out, so that the team can then carry out a free exploration of system interfaces.
Figure 3shows the BPMN diagram of said process flow.
Fig. 3. Training flow diagram
Evaluation: Flow that defines the execution phase of the heuristic evaluation of the interfaces of the selected software product. In this phase, the evaluation team examines the interfaces and, depending on the experience of each evaluator, the flow continues in one of two ways:
(1) If the evaluator has experience, they should carry out the evaluation by identifying the usability problems in the interfaces, taking screenshots of the problems found, then associating them with the heuristic principles that were not fulfilled and documenting them in the evaluation template.
(2) If the evaluator has little or no experience, they should first familiarize themselves with the heuristic principles and then proceed with the evaluation following one of two options: (a) reviewing the complete list of heuristics to identify problems according to what was learned and is remembered at the time, taking screenshots and documenting the findings in the evaluation template; or (b) identifying problems, taking screenshots, and documenting them in the evaluation template considering only one heuristic at a time.
Figure 4 shows the BPMN diagram of this flow.
Fig. 4. Evaluation flow diagram
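The branching described above, by evaluator experience and then by option (a) or (b), can be sketched as follows; the step names are assumptions introduced for this illustration, not identifiers from the BPMN model:

```python
def evaluation_steps(experienced: bool, one_at_a_time: bool = False):
    """Sketch of the branching in the evaluation sub-process.

    Experienced evaluators inspect the interfaces directly; evaluators
    with little or no experience first study the heuristics and then
    follow either option (a), reviewing the full list of heuristics, or
    option (b), considering one heuristic at a time. Step names are
    illustrative only.
    """
    if experienced:
        return ["identify_problems", "take_screenshots",
                "associate_heuristics", "document_in_template"]
    steps = ["study_heuristics"]
    if one_at_a_time:  # option (b)
        steps += ["review_one_heuristic_at_a_time", "take_screenshots",
                  "document_in_template"]
    else:              # option (a)
        steps += ["review_full_heuristic_list", "take_screenshots",
                  "document_in_template"]
    return steps
```

In every branch the outcome is the same: findings documented in the evaluation template, which feeds the discussion phase.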
Discussion: Flow that defines the elaboration phase of the final list of usability problems, by determining whether each problem was found by more than one evaluator and whether it really refers to a usability problem. In this phase, the evaluation manager verbalizes each of the problems so that the evaluator who identified it can provide more details and evidence. Then, the team determines whether the identified aspect really refers to a usability problem; if the team does not consider it so, the evaluator who reported it decides whether they still consider it a usability problem, otherwise it is discarded. If it is a relevant aspect and the problem was identified by more than one evaluator, it must be determined whether it is the same incident or different usability aspects. If it is the same incident, a definition and description of the problem is prepared collaboratively; otherwise, it is classified as a usability problem to be discussed later.
Then, if the identified aspect was classified as a usability problem, it must be determined whether it clearly expresses the incident to which it refers. If not, a definition and description of the identified problem is prepared. Finally, the problem is included in the final list.
Figure 5 shows the BPMN diagram of this flow.
Fig. 5. Discussion flow diagram
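A first mechanical step of the discussion phase, detecting which raw findings were reported by more than one evaluator and therefore need a merge decision, can be sketched as below; the data shape is a hypothetical simplification (the actual decision of whether two reports are the same incident remains a team judgment):

```python
from collections import defaultdict

def consolidate(findings):
    """Group raw findings for the discussion phase.

    `findings` is a list of (evaluator, problem_key) pairs. Problems
    reported by more than one evaluator are flagged so the team can
    discuss whether the reports refer to the same incident or to
    different usability aspects. Purely illustrative.
    """
    reporters = defaultdict(set)
    for evaluator, problem in findings:
        reporters[problem].add(evaluator)
    return {p: {"evaluators": sorted(evs),
                "needs_merge_check": len(evs) > 1}
            for p, evs in reporters.items()}
```

Problems flagged here are then verbalized one by one, merged or split, and, once clearly described, included in the final list.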
Report: Flow that defines the assignment of a severity and frequency value to each problem, in order to determine which problems should be solved most urgently. In this phase, the evaluation manager sends the consolidated list to each member of the team. Each evaluator then rates the severity and frequency of each problem; the individually assigned scores are averaged, and the standard deviation of the ratings made by the entire team is calculated. If the standard deviation is high, the reasons for the severity and frequency values assigned should be discussed, in order to reach a consensus on the criticality.
Finally, the evaluation team must offer possible solutions for each of the problems, also highlighting the positive aspects of the evaluated interfaces, and prepare a final report that describes all the results of the heuristic evaluation process. Figure 6 shows the BPMN diagram of this flow.
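The report-phase arithmetic can be sketched as follows. The criticality convention (severity plus frequency) and the deviation threshold are assumptions of this sketch; the process itself only states that a high standard deviation triggers a discussion to reach consensus:

```python
from statistics import mean, stdev

def rate_problem(severity_scores, frequency_scores, threshold=1.0):
    """Sketch of the report-phase scoring for one usability problem.

    Each evaluator rates severity and frequency; the scores are averaged
    and the standard deviation across the team is computed. `threshold`
    is an assumed cut-off (not prescribed by the process) above which
    the team should discuss the ratings before agreeing on criticality.
    """
    severity = mean(severity_scores)
    frequency = mean(frequency_scores)
    spread = max(stdev(severity_scores), stdev(frequency_scores))
    return {
        "severity": severity,
        "frequency": frequency,
        "criticality": severity + frequency,  # common convention; assumed here
        "discuss": spread > threshold,        # high disagreement => discuss
    }
```

For instance, ratings of (3, 3, 3) for severity and (2, 2, 2) for frequency yield a criticality of 5 with no discussion needed, while widely spread ratings trigger the consensus step.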
To validate the modeling performed, an HCI specialist was interviewed, who validated 100% of the result by confirming in the interview that all the criteria were successfully met. These criteria were the following:
– The detailed diagram of the formal heuristic evaluation process selected in BPMN format was presented using the Bizagi BPMN Modeler tool.
– The flow of the process was clearly understood, with no room for it to be interpreted in more than one way.
– The process was correctly modeled using standard BPMN notation for process modeling.
– The process was adequately explained, and the questions asked about it were clearly answered.
Fig. 6. Report flow diagram
6 Conclusions and Future Works
The objective of this research was to select and model a formal process of heuristic
evaluation applicable to the usability inspection process of software products through a
comparative analysis, whose results were the following: (1) a comparative table of the
formal processes documented in the literature to carry out a heuristic evaluation and (2)
a detailed diagram of the selected process modeled in BPMN.
The first result of this objective was achieved by developing a comparative table with the formal processes obtained as part of the systematic literature review, in which five protocols or formal processes were identified and compared through five measurable criteria. After analyzing all the process proposals, it was possible to conclude that the formal process "Formal processes with duly defined steps" was the most appropriate, because it fully covered the heuristic evaluation process, divided the evaluation into duly defined steps, and, unlike the others, included prior training for evaluators.
Additionally, the selected process was presented as a new proposal with respect to the evaluation process proposed by Nielsen, and its validation was evidenced through case studies, which showed that it allows finding a greater number of severe and critical problems and that evaluators tend to make fewer errors compared to the Nielsen process.
The second result of this objective was achieved by modeling the process in BPMN notation, presenting in detail the flow that must be followed to carry out the process properly. The modeled process was validated by expert judgement.
By achieving the two mentioned results, it is concluded that the objective has been met, since a solution has been given to the problem caused by the absence of a comparative analysis of the formal processes documented in the literature: a table was prepared with criteria that allowed the formal processes found in the systematic literature review to be evaluated quantitatively and qualitatively, and the process with the best indicators was recommended after reviewing the results obtained.
For future works, the formal process selected in this study will be taken as the basis for developing a software tool that automates the heuristic evaluation process; in this way, its well-defined steps could be represented in such a system to support usability evaluators in their heuristic inspections regardless of their experience level.
Acknowledgments. This work is part of the research project “Virtualización del proceso de eval-
uación de experiencia de usuario de productos de software para escenarios de no presencialidad”
(virtualization of the user experience evaluation process of software products for non-presential
scenarios), developed by HCI-DUXAIT research group. HCI-DUXAIT is a research group that
belongs to the PUCP (Pontificia Universidad Católica del Perú).
This work was funded by the Dirección de Fomento a la Investigación at the PUCP through
grant 2021-C-0023.
References
1. Li, Y., Liu, C.: User experience research and analysis based on usability testing methods.
In: Zhao, P., Ye, Z., Xu, M., Yang, Li., Zhang, L., Zhu, R. (eds.) Advances in Graphic
Communication, Printing and Packaging Technology and Materials. LNEE, vol. 754, pp. 263–
267. Springer, Singapore (2021). https://doi.org/10.1007/978-981-16-0503-1_39
2. Nielsen, J.: Usability inspection methods. In: Conference Companion on Human Factors in Computing Systems, pp. 413–414 (1994). https://doi.org/10.1145/259963.260531
3. Lecaros, A., Paz, F., Moquillaza, A.: Challenges and opportunities on the application of heuris-
tic evaluations: a systematic literature review. In: Soares, M.M., Rosenzweig, E., Marcus, A.
(eds.) HCII 2021. LNCS, vol. 12779, pp. 242–261. Springer, Cham (2021). https://doi.org/
10.1007/978-3-030-78221-4_17
4. Goetz, M.: Modeling Workflow Patterns through a Control-flow Perspective Using BPMN and the BPM Modeler BizAgi (2009)
5. International Organization for Standardization: ISO 9241-210:2019, Ergonomics of human-system interaction, Part 210: Human-centred design for interactive systems (2019)
6. Nielsen, J.: Usability Engineering. Morgan Kaufmann Publishers Inc., San Francisco, CA,
USA (1993)
7. Holzinger, A.: Usability engineering methods for software developers. Commun. ACM. 48,
71–74 (2005). https://doi.org/10.1145/1039539.1039541
8. Kitchenham, B., Charters, S.: Guidelines for performing Systematic Literature Reviews in
Software Engineering (2007)
9. Paz Espinoza, F.A.: Método para la evaluación de usabilidad de sitios web transaccionales
basado en el proceso de inspección heurística [Doctoral Thesis]. Pontif. Univ. Católica del
Perú. 275 (2018). http://hdl.handle.net/20.500.12404/9903
10. Hasan, L., Morris, A.: Usability problem areas on key international and key Arab e-commerce websites. J. Internet Commer. 16, 80–103 (2017). https://doi.org/10.1080/15332861.2017.1281706
11. Davis, D., Jiang, S.: Usability evaluation of web-based interfaces for Type 2 Diabetes Mellitus. In: Proceedings of the 5th International Conference on Industrial Engineering and Operations Management (IEOM 2015) (2015). https://doi.org/10.1109/IEOM.2015.7093713
12. Lam, D., Sajjanhar, A.: Heuristic evaluations of cultural heritage websites. In: 2018 International Conference on Digital Image Computing: Techniques and Applications (DICTA 2018) (2019). https://doi.org/10.1109/DICTA.2018.8615847
13. Jiménez, C., Cid, H.A., Figueroa, I.: PROMETHEUS: procedural methodology for developing
heuristics of usability. IEEE Lat. Am. Trans. 15, 541–549 (2017). https://doi.org/10.1109/
TLA.2017.7867606
14. Paz, F., Paz, F.A., Pow-Sang, J.A., Collazos, C.: A formal protocol to conduct usability
heuristic evaluations in the context of the software development process. Int. J. Eng. Technol.
7(2.28), 10–19 (2018). https://doi.org/10.14419/ijet.v7i2.28.12874
15. Maghfiroh, L.R.: Observation and heuristics evaluation of student web-based application of
SIPADU-STIS. J. Phys. Conf. Ser. 1511, 0–10 (2020) https://doi.org/10.1088/1742-6596/
1511/1/012019
16. Pincay-Ponce, J., Caicedo-Ávila, V., Herrera-Tapia, J., Delgado-Muentes, W., Delgado-
Franco, P.: Usabilidad en sitios web oficiales de las universidades del ecuador. Rev. Ibérica
Sist. e Tecnol. Informação. 106–120 (2020)
17. Walsh, L., Hemsley, B., Allan, M., Adams, N., Balandin, S., Georgiou, A., Higgins, I.,
McCarthy, S., Hill, S.: The e-health literacy demands of Australia’s my health record: a
heuristic evaluation of usability. Perspect. Heal. Inf. Manag. 14 (2017)
18. Díaz, J., Rusu, C., Collazos, C.A.: Experimental validation of a set of cultural-oriented usabil-
ity heuristics: e-Commerce websites evaluation. Comput. Stand. Interfaces. 50, 160–178
(2017). https://doi.org/10.1016/j.csi.2016.09.013
19. Chen, Y.N., Hwang, S.L.: A heuristic evaluation on the usability of health information web-
sites. In: Bridging Research and Good Practices towards Patient Welfare - Proceedings of the
4th International Conference on HealthCare Systems Ergonomics and Patient Safety, HEPS
2014, pp. 109–116 (2015)
20. Kiourexidou, M., Antonopoulos, N., Kiourexidou, E., Piagkou, M., Kotsakis, R., Natsis, K.:
Websites with multimedia content: a heuristic evaluation of the medical/anatomical museums.
Multimodal Technol. Interact. 3(2), 42 (2019). https://doi.org/10.3390/mti3020042
21. Yeratziotis, A., Zaphiris, P.: A Heuristic Evaluation for Deaf Web User Experience
(HE4DWUX). Int. J. Hum. Comput. Interact. 34, 195–217 (2018). https://doi.org/10.1080/
10447318.2017.1339940
22. Eliseo, M.A., Casac, B.S., Gentil, G.R.: A comparative study of video content user inter-
faces based on heuristic evaluation. In: Iberian Conference Information Systems Technologies
CISTI, pp. 1-6 (2017). https://doi.org/10.23919/CISTI.2017.7975820
23. Motlagh Tehrani, S.E., Zainuddin, N.M.M., Takavar, T.: Heuristic evaluation for Virtual Museum on smartphone. In: Proceedings of the 2014 3rd International Conference on User Science and Engineering (i-USEr 2014), pp. 227–231 (2015). https://doi.org/10.1109/IUSER.2014.7002707
24. Puspitasari, I., Indah Cahyani, D., Taufik: A user-centered design for redesigning e-government website in public health sector. In: Proceedings of the 2018 International Seminar on Application for Technology of Information and Communication (iSemantic 2018), pp. 219–224 (2018). https://doi.org/10.1109/ISEMANTIC.2018.8549726
25. Quinones, D., Rusu, C., Roncagliolo, S., Rusu, V., Collazos, C.A.: Developing usability
heuristics: a formal or informal process? IEEE Lat. Am. Trans. 14, 3400–3409 (2016). https://
doi.org/10.1109/TLA.2016.7587648
26. Halaweh, M.: Usability testing of conferences websites: a case study of practical teaching. In:
Uden, L., Hadzima, B., Ting, I.-H. (eds.) KMO 2018. CCIS, vol. 877, pp. 380–389. Springer,
Cham (2018). https://doi.org/10.1007/978-3-319-95204-8_32
27. Silvis, I.M., Bothma, T.J.D., de Beer, K.J.W.: Evaluating the usability of the information
architecture of academic library websites. Libr. Hi Tech. 37, 566–590 (2019). https://doi.org/
10.1108/LHT-07-2017-0151
28. Paz, F., Paz, F.A., Arenas, J.J., Rosas, C.: A Perception study of a new set of usability heuristics
for transactional web sites. In: Karwowski, W., Ahram, T. (eds.) IHSI 2018. AISC, vol. 722,
pp. 620–625. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-73888-8_96
29. Paz, F., Paz, F.A., Pow-Sang, J.A.: Comparing the effectiveness and accuracy of new usability
heuristics. Adv. Intell. Syst. Comput. 497, 163–175 (2017). https://doi.org/10.1007/978-3-
319-41956-5_16
30. Paz, F., Paz, F.A., Sánchez, M., Moquillaza, A., Collantes, L.: Quantifying the usability
through a variant of the traditional heuristic evaluation process. In: Marcus, A., Wang, W.
(eds.) DUXU 2018. LNCS, vol. 10918, pp. 496–508. Springer, Cham (2018). https://doi.org/
10.1007/978-3-319-91797-9_36
31. Inal, Y.: University students’ heuristic usability inspection of the National Library of Turkey
website. Aslib J. Inf. Manag. 70, 66–77 (2018). https://doi.org/10.1108/AJIM-09-2017-0216
32. dos Santos Pergentino, A.C., Canedo, E.D., Lima, F., de Mendonça, F.L.L.: Usability heuris-
tics evaluation in search engine. In: Marcus, A., Rosenzweig, E. (eds.) HCII 2020. LNCS,
vol. 12200, pp. 351–369. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-49713-
2_25
33. Iadanza, E., Fabbri, R., Luschi, A., Melillo, P., Simonelli, F.: A collaborative RESTful cloud-
based tool for management of chromatic pupillometry in a clinical trial. Heal. Technol. 10(1),
25–38 (2019). https://doi.org/10.1007/s12553-019-00362-z
34. Federici, S., et al.: Heuristic evaluation of eglu-box: a semi-automatic usability evaluation
tool for public administrations. In: Kurosu, M. (ed.) HCII 2019. LNCS, vol. 11566, pp. 75–86.
Springer, Cham (2019). https://doi.org/10.1007/978-3-030-22646-6_6
35. de Menezes, M., Falco, M.: The relationship of the studies of ergonomic and human computer
interfaces a case study of graphical interfaces in e-commerce websites. In: Marcus, A., Wang,
W. (eds.) HCII 2019. LNCS, vol. 11586, pp. 474–484. Springer, Cham (2019). https://doi.
org/10.1007/978-3-030-23535-2_35
36. Paz, F., Paz, F.A., Pow-Sang, J.A.: Evaluation of usability heuristics for transactional web
sites: a comparative study. Adv. Intell. Syst. Comput. 448, 1063–1073 (2016) https://doi.org/
10.1007/978-3-319-32467-8_92
37. Retore, A.P., Guimarães, C., Leite, M.K.: Heuristics for grid and typography evaluation of art
magazines websites. In: Kurosu, M. (ed.) HCI 2016. LNCS, vol. 9731, pp. 408–416. Springer,
Cham (2016). https://doi.org/10.1007/978-3-319-39510-4_38
38. Huang, Z.: Usability of tourism websites: a case study of heuristic evaluation. New Rev.
Hypermedia Multimed. 26, 55–91 (2020). https://doi.org/10.1080/13614568.2020.1771436
39. Morales, J., Rusu, C., Botella, F., Quiñones, D.: Programmer experience: a set of heuristics
for programming environments. In: Meiselwitz, G. (ed.) HCII 2020. LNCS, vol. 12195,
pp. 205–216. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-49576-3_15
40. Okhovati, M., Karami, F., Khajouei, R.: Exploring the usability of the central library websites
of medical sciences universities. J. Librariansh. Inf. Sci. 49, 246–255 (2017). https://doi.org/
10.1177/0961000616650932
41. Fung, R.H.Y., Chiu, D.K.W., Ko, E.H.T., Ho, K.K.W., Lo, P.: Heuristic usability evaluation of
university of hong kong libraries’ mobile website. J. Acad. Librariansh. 42, 581–594 (2016).
https://doi.org/10.1016/j.acalib.2016.06.004
... We focused on the Evaluation Phase due to the lack of tools that support conducting the methods in an integrated perspective. As a first approach, in the studies of Lecaros, A. [5] and Tapia, A. [6], The authors proposed two processes to conduct standard Heuristic Evaluation and tree testing, respectively, two of the most used evaluation usability methods. However, the processes only support one test each, the specific one for which they were designed. ...
Conference Paper
Full-text available
During the COVID-19 pandemic, conducting UX evaluations was difficult , increasing the pressure to develop usable software. For this objective, different methods and tools for the evaluation of usability were delivered, but no clear process or method to use in a determinate context was found in the literature. In that sense, previous studies showed interest in a formal process to conduct Usability evaluations. First, a formal process to conduct Heuristic Evaluation was proposed by Lecaros, A. followed by the process to conduct Tree Testing, proposed by Tapia A., both proposals were integrated into a process to conduct generic UX evaluations. In this study, we developed a software tool that supports the integrated process previously proposed by the authors, DUXAIT-NG, which is an academic software for usability and User Experience Evaluation. This tool is intended to support the evaluation process, from gathering the participants to performing the evaluation and interpreting the results. DUXAIT NG is capable of handling different types of UX Evaluation and Testing methods. This software allows us to perform complete usability testing, not only the stage of design or planning, which was the problem evidenced in the literature with other software used in the Usability area. The first approach was software that supported a formal process for the Heuristic Evaluation, continued by another software for Tree Testing. As a team, we decided to develop a more general software to create a standard and support a unified process that allows the researchers to conduct any usability evaluation focused in a collaborative and remote environment. The first version is available on www.duxait-ng.net.
... Then, two methods to perform the heuristic evaluation were considered within the usability evaluations. The first, through the formal process proposed by Freddy Paz [8], which was selected as the most complete heuristic evaluation process in previous research [9], and the second, through the evaluation method proposed by Toni Granollers [4]. Figure 2 shows the phases of (1) planning, (2) execution, and (3) analysis to be able to carry out these evaluations. ...
Conference Paper
Full-text available
Heuristic evaluation is one of the most popular usability methods since a group of usability evaluation experts can find approximately 75% of all usability problems in the reviewed interfaces. Due to its benefits, there are various proposals on how to carry out a heuristic evaluation. One of them is the proposal by Granollers. Although this evaluation method is practical, direct, and valuable to obtain a quantitative result of the percentage of usability this continues to be carried out manually by using a template. As part of this research, it was proposed to develop a system that supports the heuristic evaluation method of T. Granollers. Likewise, to measure the perception of the usability evaluators who used this new system in comparison with the template, it was proposed to carry out a case study in which two groups of evaluators could perform a heuristic evaluation using the method of Granollers to a transactional e-commerce website. Team A used the template to carry out the evaluation, while Team B used the proposed support system. The results allowed us to demonstrate that the team that used the support tool for the heuristic evaluation method of Granollers had a better perception of the evaluated TAM criteria. For this reason, the system is easy to use, it is perceived as useful for conducting evaluations, and evaluators have a great interest in using it in their heuristic evaluations, which highlights the importance of promoting the use of heuristic evaluation methods through automated tools available to usability experts.
... In heuristic evaluation, evaluators assess whether the system's user interface design follows a set of predetermined standard principles and recognize their violations as usability problems. Usability problems such as incorrect and ambiguous messages of the system, unfamiliar language and feedback lacking important information, etc., cause a reduction in efficiency, confusion of users, system inefficiency, unproductive interaction of users with the system, and a decrease in productivity and user satisfaction [9]. ...
Article
Full-text available
Introduction: As university systems are dealing with a wide range of users, including students, managers, and university employees, these systems can likely satisfy the needs and activities of various users with maximum quality in the shortest possible time. Consequently, it is very essential to observe usability features in the design of such systems. The current study aimed to evaluate the usability of the Hamava system to recognize and resolve its problems and weaknesses.Material and Methods: The present study is a cross-sectional descriptive study that was done in the second half of 2022 on the Hamava system of students of Birjand University of Medical Sciences. Nielsen's Heuristics (ten principles) were used to check the compliance of this system with the usability principles. Descriptive statistics were used to analyze the overall severity of the identified usability factors collected using the checklist.Results: In the present evaluation, 176 problems were recognized, with the highest number of problems related to the violation of the two principles of aesthetic and minimalist design (n=58), user control and freedom (n=31), and the least problem related to the recognition rather than recall principle (n=4).Conclusion: The results of the current study revealed that though a large number of users of the Hamava system are students and it is expected that this system will be designed based on the latest standards and the needs of its users, this system also faces usability problems that if not resolved, it can cause an increase in errors, dissatisfaction of users, decrease in the quality of information, waste of users' time, and lack of effective user interaction with the educational system.
Article
Full-text available
En el contexto de una experiencia de rediseño de UX, se eligió el US Deparment of Labor como proyecto para analizar sus pain points y rediseñarlo. Se concluyeron todas las etapas del rediseño asignado, incluida la evaluación. Asimismo, en una iteración de validación de los resultados, se propuso obtener cuantitativamente el nivel de usabilidad del prototipo. Para ello, se seleccionó el método de evaluación heurística de Granollers. En este sentido, se propuso previamente un proceso unificado que soporta toda la evaluación UX, y se liberó la herramienta DUXAIT-NG. Entonces, se llevó a cabo un caso de estudio en el que se planificó la evaluación heurística remota, donde un equipo de 4 evaluadores expertos interactuó con el rediseño y, por último, se analizaron los resultados. Todas estas actividades se realizaron con la herramienta DUXAIT-NG. Esta herramienta no solo demostró proporcionar un apoyo adecuado a todo el proceso de evaluación, sino que también supuso una ventaja en comparación con la aplicación manual de la evaluación. Por último, según el método Granollers, el rediseño del sitio web obtuvo un nivel de usabilidad del 83,31%. Con estos resultados, se validó la usabilidad de la propuesta realizada por el equipo que llevó a cabo el rediseño. Además, el equipo obtuvo importantes conclusiones del análisis de los informes de los evaluadores.
Article
Full-text available
This study applied Nielsen's heuristics to assess the user interface of the Sport Analysis Application. The purpose of this application aided students in evaluating their performance based on the test results and adhering to the training guidelines derived from the application's recommendations. The primary framework used in this investigation is Nielsen's heuristics, coupled with experiments to gauge the extent to which the application's user interface aligns with user preferences (user-friendly). The study targeted a population of students, with a sample size of 100 individuals. Analysis and discussion were conducted through the distribution of questionnaires to participants. The research findings revealed that the Visibility of System Status score reached 151. Overall, the study concludes that the developed Sport Analysis Application successfully achieves its primary objectives with a user-friendly interface.
Conference Paper
This workshop will offer participants the opportunity to engage in a face-to-face learning experience. The main objective of this workshop is to equip participants with the essential skills and knowledge necessary to plan and conduct heuristic evaluations using the DUXAIT-NG platform. The proposed workshop will have two main axes: the theoretical part, where the most relevant concepts on heuristic evaluations will be developed, and the method proposed by Granollers will be detailed. Also, how to carry out a heuristic evaluation under a defined process. In the second part, the practical one, participants will be introduced to the DUXAIT-NG platform, and its main features will be detailed in a practical way, as well as how this platform supports the evaluation process detailed in the theoretical part. In this section, participants will interact with the platform, and the planning, conduction, and analysis of a case study will be presented, with the active participation of the attendants.
Chapter
Full-text available
The heuristic evaluation belongs to the usability inspection methods and is considered one of the most popular methods since it allows the discovery of over 75% of the total usability problems involving only 3 to 5 usability experts. However, certain problems and challenges have been identified during their execution. One of the identified problems is the lack of validation scenarios that demonstrate the contributions and benefits of incorporating systems that provide support for heuristic evaluation. Although there are case studies that could support the problems described above, the literature has shown that these are quite scarce and, for the most part, are software products that are used as support tools for certain elements to be evaluated, such as, for example, visual clarity of the system, page links validations and readability. For this reason, since there are no validation scenarios that demonstrate the contributions and benefits of the automation of heuristic evaluations through software products, the result is that there is a low level of confidence in usability evaluators to use a system that supports the heuristic evaluation process, for so the inspection continues to be carried out manually. As a part of previous research, a software product was implemented to support the five steps (planning, training, evaluation, discussion, and report) of a previously selected formal process to perform heuristic evaluations. To test the previously developed software product, a case study that demonstrates the contributions and benefits of incorporating the web application developed as a support tool for heuristic evaluation was proposed, and it has been of the utmost importance since it has allowed performing a comparative analysis between the evaluation when is performed by using the selected formal process through a template contained in an MS Excel format document, and by using the web application developed as a result of the aforementioned previous research. 
The results demonstrated that a higher average result was obtained, for the team that used the implemented system, in all the established criteria. The interpretation of the results can be summarized with the fact that the system obtained a better perception by the team who used it than the results obtained by the team that used the evaluation template.KeywordsHuman-computer InteractionUsabilityHeuristic EvaluationProcessBPMN
Chapter
Full-text available
Based on the new collaborative work conditions and previous research, the need to establish a process of evaluation of the User Experience in its remote modality was evidenced. In this sense, it was considered to take an approach oriented to improving processes to elaborate a proposal. As part of the proposal, it was identified that it would be necessary to facilitate a workshop with the stakeholders involved in these processes to work collaboratively. As a case study, a virtual workshop was held with the stakeholders (UX research specialists, HCI specialists, UX designers, and Developers) to obtain feedback and pain points of the AS-IS process and proposed improvement opportunities for the TO-BE process. With the participation of 5 specialists, 19 improvements were identified for the tree testing process (8 improvements for the virtual process and 11 improvements for the face-to-face process) and 18 improvements for the face-to-face process of heuristic evaluation. Given their experience in evaluations and user experience design, the participants could prioritize the high-impact improvements for the evaluation processes analyzed. Because of the results achieved, the inputs to propose the TO-BE process, as well as the feedback received during the facilitation of the workshop, and as well at the end, the team can affirm that the whole objectives of the workshop were achieved. Besides, we can affirm that the facilitation of remote workshops allows for obtaining valuable information from key actors. Finally, we affirm as well that remote workshops allow collaboration and knowledge sharing between the participants, despite the limitations of distance, and their non-physical presence.KeywordsFacilitationCollaborative WorkRemote WorkshopUser ExperienceProcess ImprovementHuman-Computer Interaction
Article
Full-text available
Los sitios web son un componente clave en las organizaciones del competitivo y globalizado mundo actual, paralelamente su usabilidad se reviste de importancia en términos de satisfacer las necesidades y expectativas de sus usuarios. En este escenario se encuentran las universidades, de cuyos sitios web se espera que cubran diversas necesidades de información para los estudiantes, profesores, personal y exalumnos, que se forman o formaron en ellas. En esta investigación se presenta una evaluación heurística multicriterio basado en la norma ISO 9241-151, donde se valora la ergonomía de la interacción hombre-sistema, de las interfaces de usuario web en función de los criterios de diseño, presentación, búsqueda, diseño de contenido y navegación; que se aplicaron en la construcción de estos sitios de las 59 universidades de Ecuador. Finalmente, se discuten las recomendaciones para el diseño de estos sitios y se reflexiona sobre los problemas detectados en las pruebas de usabilidad. Palabras-clave: ISO 9241-151, HCI, evaluación heurística, sitios web.
Chapter
Full-text available
Heuristic evaluation belongs to the usability inspection methods and is considered one of the most popular, since it allows over 75% of the total usability problems to be discovered involving only 3 to 5 usability experts, in comparison with user tests. However, certain problems and challenges have been identified at the time of its execution. In this study we present the results of conducting a Systematic Literature Review (SLR) to identify case studies, challenges, problems, and opportunities in the execution of heuristic evaluations, in the context of research on the automation and formalization of the process. For this SLR, we employed the protocol proposed by Kitchenham and Charters. The search was carried out on September 7, 2020 and retrieved a total of 167 studies, of which 37 were selected for this review. The results show that the main challenges are related to the low suitability of the chosen set of heuristics and the low expertise of usability evaluators. Additionally, we identified that very few software solutions exist that support and automate the process. Finally, we found that there are many protocols to follow when applying a heuristic evaluation, such as the definition of new usability heuristics for a given case study. According to the results obtained, we can conclude that it is necessary to develop and validate a tool, based on a formal protocol, that supports and automates the heuristic evaluation process and addresses the challenges and opportunities identified in this research.
Chapter
Full-text available
Network growth through recent years reveals our continuous search for information. With the urge to consume information, search engines rise in popularity, becoming our browser's primary function when typing in the navigation bar. The increasing volume of information amplifies the demands of search refining, with emphasis on presenting data correctly and delivering answers to users successfully. This paper presents a usability evaluation of the search engine for official documents of the Brazilian Federal Court of Accounts (Tribunal de Contas da União - TCU). The need to update and centralize the search engines arose with the increasing technology quality in recent years. Despite the good functionality already implemented, the outdated interfaces, with no common standards, would no longer fit today's use. The urge to create a pattern in TCU systems brought the idea of centralizing different types of documents in a single search engine. Applying the inspection method proposed by Nielsen [1], it was possible to gather issues that had been neglected in the development process. The results were outstanding: 49 issues found in a product that had already been released. Waiting for users' feedback to correct a software product is far from ideal, making it necessary to apply inspection methods when evaluating a system's usability. Severity ratings were applied in the inspection method in order to categorize the issues and sort their solution priorities. By rating their frequency, impact, and persistence, it was possible to combine those rates into a single severity rate assigned to each issue. The work reported in this paper made possible the correction of several usability errors occurring in the system. Being a wide project, the number of issues found, though it seemed large, is proportionally not so big: approximately one issue per screen is an acceptable result.
The limitations of the studies were related to the number of specialists involved, although the results indicate that a small group of evaluators is sufficient, especially when the project, despite having many screens, has many replicated components: screens are not the same, but reuse interface modules. The contribution is primarily internal, by assisting in the delivery of a better product. But the study and reporting used in this paper allowed the authors to summarize the benefits of the heuristic inspection method, presenting the results in a real context, with explicit numbers and categorization.
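The severity-rating step described in the abstract above can be sketched as follows. The 0-4 scale and the use of a rounded mean to combine frequency, impact, and persistence are assumptions for illustration (the paper does not specify its combination rule), and the issue names are hypothetical:

```python
# Sketch of combining per-issue ratings into a single severity rate.
# Assumption: frequency, impact and persistence are each rated 0-4 (a scale
# commonly used with Nielsen-style severity ratings) and combined as the
# mean rounded to the nearest integer. Issues are then sorted so that the
# highest-severity problems are corrected first.

from statistics import mean

def severity(frequency: int, impact: int, persistence: int) -> int:
    return round(mean((frequency, impact, persistence)))

issues = [  # hypothetical example issues, not taken from the TCU study
    ("inconsistent button label", 2, 1, 1),
    ("no feedback after search", 4, 3, 3),
]
issues.sort(key=lambda i: severity(*i[1:]), reverse=True)
for name, *rates in issues:
    print(f"severity {severity(*rates)}: {name}")
```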
Article
Full-text available
Politeknik Statistika STIS has an integrated information system called SIPADU-STIS. One part of the system is a web-based application for students. Students use this system to support daily learning activities such as checking lecture schedules, grades, etc. The system was built by staff and students. However, this new system was tested only with the black-box method and with users through the System Usability Scale (SUS) questionnaire. The system has not been tested in terms of its interface and its interaction with users, whereas a well-designed interface and user interaction can improve the quality of service to users (students). Therefore, the researchers set out to test the interface and the interaction with users of the system. We conducted an observation of the application and a heuristic evaluation, based on the 10 principles by Nielsen. Respondents were chosen from among the students by convenience sampling. Based on the data obtained, all aspects still have shortcomings or problems, even though only 16.7% stated so. Only one aspect has a cosmetic problem: match between the system and the real world. Six aspects have minor usability problems: visibility of system status/feedback, consistency and standards, error prevention, recognition rather than recall, aesthetic and minimalist design, and help and documentation. One aspect has a major usability problem: help users recognize, diagnose, and recover from errors. The last two aspects, which must be redesigned, are user control and freedom, and flexibility and efficiency of use.
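The SUS questionnaire mentioned above is scored with a standard procedure (independent of SIPADU-STIS): ten items rated 1-5, where odd-numbered items contribute (rating - 1), even-numbered items contribute (5 - rating), and the sum of contributions is multiplied by 2.5 to yield a 0-100 score. A minimal sketch:

```python
# Standard System Usability Scale (SUS) scoring. The ratings list holds the
# 1-5 answers to the ten items in order; items 1, 3, 5, 7, 9 (even indexes
# here) contribute (rating - 1), items 2, 4, 6, 8, 10 contribute
# (5 - rating); the total is scaled by 2.5 onto a 0-100 range.

def sus_score(ratings: list[int]) -> float:
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)
        for i, r in enumerate(ratings)
    ]
    return sum(contributions) * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # prints 75.0
```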
Chapter
The definition of user experience (UX) is broad and covers several aspects. The job of any programmer is very specific and demanding: they use different systems or tools to carry out their programming tasks. We consider a programmer a specific kind of user, one who employs programming environments and other software development artifacts. We therefore refer to this particular kind of UX as Programmer eXperience (PX). Several authors have defined different aspects of PX, including, among others, language features, programming learning factors, and programmer performance. Usability is a relevant aspect of UX, as well as an important aspect of programming environments. Heuristic evaluation is an inspection method that allows evaluating the usability of interactive software systems. We developed a set of heuristics following the methodology proposed by Quiñones et al. We defined a new set of 12 specific heuristics that incorporate concepts of the UX and usability of programming environments. These heuristics have been validated following that methodology as well. The results obtained for the different effectiveness criteria were satisfactory. However, the set of heuristics could be further refined and validated in new scenarios or case studies.
Article
Today, a multitude of tourism websites are placing online services and information at users’ fingertips. Usability is an important factor in determining the success of such websites. This study adopts a user-centred approach to empirically assess the usability of tourism websites. Our results reveal serious usability weaknesses, including poor navigation, difficulty in completing tasks, overwhelming presentation of options, inconsistent naming of objects, and lack of warning messages. We leverage our findings to propose usability guidelines for the design of improved tourism websites.
Article
Chromatic pupillometry represents a novel approach for the assessment of inherited retinal diseases. A multi-centric pilot study with a sample of 40 paediatric patients has been designed, involving physicians and engineers. In this paper, the Electronic Medical Record named ORÁO, specifically developed to collect ophthalmologic and pupillometric data, is presented. The platform is a cloud-based application with a RESTful, three-tier architecture. These features make it available via the web to the ophthalmologists involved in the project, who work in two different university centres. The platform was designed by the whole team and developed by the Department of Information Engineering of the University of Florence. The interfaces of the medical record have been evaluated in terms of usability, according to standards. A heuristic evaluation was performed in the first stage of the design of the platform, and the main severe usability issues were addressed. The outcome of the project is a customized software solution. Moreover, the physicians have an excellent attitude toward the use of ORÁO and perceive it as a useful tool to gather the data they collect, with the aim of evaluating the overall progression of the pilot study.
Chapter
This paper illustrates the heuristic evaluation of eGLU-box, a web-based usability-testing tool for Public Administrations. eGLU-box is an online platform that supports practitioners in designing usability tests and analyzing data, and helps participants step by step to complete assessment tasks. Web users of Public Administrations can report their perceived quality of experience by completing a library of questionnaires shown to them by eGLU-box at the end of the test. This work is part of a multi-step user experience (UX) evaluation methodology to assess the platform, which uses standard and bio-behavioural evaluation methods. This work shows the results of the heuristic evaluation of eGLU-box involving five human factors experts and 20 practitioners working in Italian Public Administrations. Findings show that most of the problems are rated as minor and relate to Nielsen's heuristic "visibility of the system." Only 9% of problems are rated as major; these are related to the "match between system and the real world" heuristic. Evaluators provided indications for improvements that will be applied to the next version of the platform.
Chapter
The concept of Human-Computer Interaction (HCI) has contributed to the creation of several tools and interaction devices that allow day-to-day tasks to be performed with greater ease, efficiency, and intelligence. It enables the optimization of energy consumption and agility in mobility and accessibility, performing actions that improve people's quality of life. The Digital Design course has as its main objective the search for solutions for digital interfaces, with a focus on the value of users. This article presents the results of the heuristic evaluation of the ergonomic quality of the graphical interfaces of websites, through the Ergonomic Aspects developed by Scapin and Bastien, carried out during the Ergonomics course.