Selection and Modeling of a Formal Heuristic
Evaluation Process Through Comparative
Analysis
Adrian Lecaros(B), Arturo Moquillaza, Fiorella Falconi, Joel Aguirre,
Alejandro Tapia, and Freddy Paz
Pontificia Universidad Católica del Perú, Av. Universitaria 1801, San Miguel, Lima 32, Lima,
Perú
{adrian.lecaros,ffalconit,aguirre.joel}@pucp.edu.pe,
{amoquillaza,a.tapiat,fpaz}@pucp.pe
Abstract. Due to the importance of usability, multiple usability evaluation methods have been proposed to help Human-Computer Interaction (HCI) specialists determine whether the interfaces of a software product are usable, easy to use, understandable, and attractive. Among these methods, the heuristic evaluation proposed by Jakob Nielsen stands out. Although Nielsen offers general guidelines for the execution of heuristic evaluations, very few studies present a formal agreement or process on how the evaluations should be carried out. This leads to the problem of the absence of a comparative analysis that would allow determining the most appropriate formal evaluation process for carrying out heuristic inspections. Some proposals found in the literature compare the results of the execution of heuristic evaluations, the definition of new heuristics, qualification forms, and the formalization of the complete process in five phases. Although these proposals contribute to the formalization of the heuristic evaluation process, the literature review did not provide a comparative analysis that determines which is the most appropriate. This could cause usability evaluators to interpret and carry out their own procedure, leading to inaccuracies in the results and increasing the probability of improperly executing the inspection. The purpose of this study was to elaborate a comparative table of the formal processes. This table allowed grouping the various studies found in the literature, selecting a process, and modeling the whole process with a BPMN tool. Finally, the modeled process was validated by expert judgement of HCI specialists.
Keywords: Human-Computer Interaction · Usability · Heuristic evaluation · Process · BPMN
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
M. M. Soares et al. (Eds.): HCII 2022, LNCS 13321, pp. 28–46, 2022.
https://doi.org/10.1007/978-3-031-05897-4_3

1 Introduction

Nowadays, usability is an essential aspect of the User Experience in the context of interaction between users and software products, since it plays a fundamental role in the use, acceptance, and interaction of users with those software products [1]. Due to
the importance of usability, multiple usability evaluation methods have been proposed to help Human-Computer Interaction specialists determine whether the interfaces of a software product are usable, easy to use, understandable, and attractive. Among these methods, the heuristic evaluation proposed by Jakob Nielsen stands out [2]. Although Nielsen offers general guidelines for the execution of heuristic evaluations, very few studies present a formal agreement or process on how the evaluations should be carried out. This leads to the problem of the absence of a comparative analysis that would allow determining the most appropriate formal evaluation process for carrying out heuristic inspections [3]. Some proposals found in the literature compare the results of the execution of heuristic evaluations, the definition of new heuristics, qualification forms, and the formalization of the complete process in five phases: planning, training, evaluation, discussion, and report. Although these proposals contribute to the formalization of the heuristic evaluation process, the literature review did not provide a comparative analysis that determines which is the most appropriate. This could cause usability evaluators to interpret and carry out their own procedure, leading to inaccuracies in the results and increasing the probability of improperly executing the inspection.

As the objective of this study, a comparative table of the formal processes was elaborated. This table allowed grouping the various studies found in the systematic literature review that have used protocols or formal processes which contribute, to a certain extent, to the execution of heuristic evaluations. The grouping made it possible to measure and compare the characteristics, the degree of coverage, and the benefits of each one, and the result led to a complete formal process. This result was achieved through a systematic literature review that sought to answer the question: “what are the characteristics of the protocols or formal processes that are being used to improve the performance of heuristic evaluations of software products?”. The comparison of the characteristics showed that the formal process in the category “Formal processes with duly defined steps” obtained the best results, providing a positive response for each of the criteria: it contributes to the entire heuristic evaluation process through duly defined steps and consolidates the different ways in which evaluations are carried out into a single one, reducing the room for interpretation by usability evaluators in how to carry them out.
Additionally, a detailed diagram of the selected process was made with the objective of modeling in detail the steps and tasks that must be followed to carry out the process correctly and without room for different interpretations by the evaluators who use it. The main process has been divided into the execution of five sub-processes that correspond to the steps indicated in the selected formal process: (1) planning, (2) training, (3) evaluation, (4) discussion, and (5) report. This result was achieved using the Bizagi BPMN Modeler tool [4], which allowed the design and modeling of the process in BPMN notation. The model consists of six flows that are presented in this research. All these deliverables were validated by expert judgement of HCI specialists.
This paper is structured as follows: In Sect. 2, we describe the main concepts belonging to the Human-Computer Interaction area used in the study. In Sect. 3, we present the result of the systematic literature review that answers the question of the characteristics of the protocols or formal processes that are being used to improve the performance of heuristic evaluations of software products. In Sect. 4, we present the comparative analysis and discussion of the protocols or formal processes, and the process selected for this research. In Sect. 5, we present the modeling of the selected formal process using the Bizagi BPMN Modeler tool, and its validation. Finally, in Sect. 6, we present the conclusions of the research and the future work to be done.
2 Background
In this section, we present the main concepts related to this work.
2.1 Usability
Usability, according to ISO 9241-210:2019 [5], is the “extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use”.
Additionally, Jakob Nielsen [6] defines it as the evaluation of five attributes the user interface of a system must have, which are the following:

• Learnability: The system should be easy to learn so that the user can perform some tasks with the system as quickly as possible.
• Efficiency: The system should be efficient during its use to provide the highest level of productivity possible.
• Memorability: The system should be easy to remember so that the casual user can use it again after a period of leaving it, without the need to relearn how it works.
• Errors: The system should have a low error rate so that users make as few errors as possible and can quickly recover from them. Errors considered catastrophic must not occur.
• Satisfaction: The system should be pleasant to use so that users are subjectively satisfied while using it.
2.2 Heuristic Evaluation
According to Andreas Holzinger [7], Heuristic Evaluation (HE) belongs to the usability inspection methods and is the most common informal method. Its execution requires usability experts who can identify whether the dialogue elements or other interactive software elements follow the established principles of usability.

According to Jakob Nielsen [6], the heuristic evaluation allows the inspection of what is good and bad in the interface of a system, which could be done through one’s own opinion or, ideally, using well-defined guidelines. The author also maintains that the main goal of the evaluation is to find usability problems in the design of an interface, which is done by a group of evaluators who inspect and judge it against usability principles called heuristics. Additionally, a single evaluator finds only about 35% of the usability problems in an interface; however, each evaluator usually encounters different types of problems. For this reason, he recommends the participation of 3 to 5 evaluators to obtain the best cost-benefit ratio. These evaluations are performed individually, and then, upon completion, the results are compared for an overall usability analysis.
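Nielsen’s observation above (a single evaluator finds only about 35% of the problems) is often combined with an independence assumption to estimate how coverage grows with the number of evaluators: the expected fraction found by n evaluators is 1 − (1 − 0.35)^n. A minimal sketch of this estimate follows; the independence assumption and the fixed 35% rate are simplifications for illustration, not claims from this paper:

```python
def expected_coverage(n_evaluators: int, single_rate: float = 0.35) -> float:
    """Expected fraction of usability problems found by n evaluators,
    assuming each one independently finds `single_rate` of the problems."""
    return 1 - (1 - single_rate) ** n_evaluators

# Coverage grows quickly and then flattens, which is consistent with the
# 3-to-5 evaluator recommendation for the best cost-benefit ratio.
for n in (1, 3, 5, 10):
    print(n, round(expected_coverage(n), 2))
# → 1 0.35, 3 0.73, 5 0.88, 10 0.99
```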
2.3 Systematic Literature Review
According to Barbara Kitchenham and Stuart Charters [8], a Systematic Literature Review (SLR) consists of identifying, evaluating, and interpreting the most relevant available studies to answer review questions about an area or phenomenon of interest. Additionally, for this study, we use the formal protocol proposed by Kitchenham, in which the review is carried out in three phases: (1) planning, (2) conducting, and (3) reporting.

Planning consists of identifying the need for the review, specifying the research questions, and developing and evaluating the review protocol. The conducting phase covers the identification of the review, the selection of primary studies, and the extraction, monitoring, and synthesis of data. Finally, the reporting phase covers the specification of dissemination mechanisms, as well as the formatting and review of the main report generated.
2.4 Heuristic Evaluation Formal Process
According to Freddy Paz [9], it consists of a framework, based on the analysis of case studies, that provides a structured way to execute the heuristic evaluation in order to reduce the different interpretations that arise from its use. An example of a formal process is the one defined by the author, who establishes five phases for its execution: (1) planning, (2) training, (3) evaluation, (4) discussion, and (5) report.
3 Characteristics of Protocols or Formal Processes Used to Improve
the Performance of Heuristic Evaluations
According to the systematic literature review from our previous research [3], the answer to the research question “what are the characteristics of the formal protocols or processes that are being used to improve the performance of heuristic evaluations in software products?” was obtained by grouping the information from 33 studies into 6 main protocols. Table 1 shows the primary studies selected to answer the review question.
Table 1. Selected primary studies

ID | Study | Quote | Search engine
S01 | Usability Problem Areas on Key International and Key Arab E-commerce Websites | [10] | Scopus
S02 | Usability evaluation of web-based interfaces for Type2 Diabetes Mellitus | [11] | Scopus
S03 | Heuristic Evaluations of Cultural Heritage Websites | [12] | Scopus
S04 | PROMETHEUS: Procedural Methodology for Developing Heuristics of Usability | [13] | Scopus
S05 | A formal protocol to conduct usability heuristic evaluations in the context of the software development process | [14] | Scopus
S08 | Observation and heuristics evaluation of student web-based application of SIPADU-STIS | [15] | Scopus
S09 | Usabilidad en sitios web oficiales de las universidades del ecuador | [16] | Scopus
S10 | The E-health Literacy Demands of Australia’s My Health Record: A Heuristic Evaluation of Usability | [17] | Scopus
S11 | Experimental validation of a set of cultural-oriented usability heuristics: e-Commerce websites evaluation | [18] | Scopus
S12 | A heuristic evaluation on the usability of health information websites | [19] | Scopus
S13 | Websites with multimedia content: A heuristic evaluation of the medical/anatomical museums | [20] | Scopus
S14 | A Heuristic Evaluation for Deaf Web User Experience (HE4DWUX) | [21] | Scopus
S15 | A comparative study of video content user interfaces based on heuristic evaluation | [22] | Scopus
S16 | Heuristic evaluation for Virtual Museum on smartphone | [23] | Scopus
S17 | A User-Centered Design for Redesigning E-Government Website in Public Health Sector | [24] | Scopus
S18 | Developing Usability Heuristics: A Formal or Informal Process? | [25] | Scopus
S20 | Método para la evaluación de usabilidad de sitios web transaccionales basado en el proceso de inspección heurística | [9] | Alicia Concytec
S21 | Usability testing of conferences websites: A case study of practical teaching | [26] | Scopus
S22 | Evaluating the usability of the information architecture of academic library websites | [27] | Scopus
S23 | A perception study of a new set of usability heuristics for transactional websites | [28] | Scopus
S24 | Comparing the effectiveness and accuracy of new usability heuristics | [29] | Scopus
S26 | Quantifying the usability through a variant of the traditional heuristic evaluation process | [30] | Scopus
S27 | University students’ heuristic usability inspection of the national library of Turkey website | [31] | Scopus
S28 | Usability heuristics evaluation in search engine | [32] | Scopus
S29 | A collaborative RESTful cloud-based tool for management of chromatic pupillometry in a clinical trial | [33] | Scopus
S30 | Heuristic Evaluation of eGLU-Box: A Semi-automatic Usability Evaluation Tool for Public Administrations | [34] | Scopus
S31 | The Relationship of the Studies of Ergonomic and Human Computer Interfaces – A Case Study of Graphical Interfaces in E-Commerce Websites | [35] | Scopus
S32 | Evaluation of usability heuristics for transactional web sites: A comparative study | [36] | Scopus
S33 | Heuristics for grid and typography evaluation of art magazines websites | [37] | Scopus
S34 | Usability of tourism websites: a case study of heuristic evaluation | [38] | Scopus
S35 | Programmer experience: a set of heuristics for programming environments | [39] | Scopus
S36 | Exploring the usability of the central library websites of medical sciences universities | [40] | Scopus
S37 | Heuristic Usability Evaluation of University of Hong Kong Libraries’ Mobile Website | [41] | Scopus
Also, Table 2 shows the protocols or formal processes and the studies in which they were found.
Table 2. Studies that include protocols or formal processes used for heuristic evaluations

Protocol or formal process | Research studies | # Studies
Definition of new heuristics | S01, S03, S04, S11, S14, S18, S22, S23, S24, S29, S31, S32, S33, S35 | 14
Nielsen derived evaluation items | S12, S13, S15, S21, S27, S28, S30, S34, S36, S37 | 10
Comparison and grouping between evaluations | S01, S08, S10, S17, S18, S23, S24, S32 | 8
Qualification methods | S02, S16, S34 | 3
Methodologies applied to heuristic evaluation | S09, S22, S26 | 3
Formal processes with properly defined steps | S05, S20 | 2
Descriptions of formal protocols or processes and their benefits are presented below.
1. Definition of new heuristics: The definition of new heuristics is a fundamental protocol applied in the execution of heuristic evaluations since, despite the existence of the Nielsen heuristics, these will not always cover what is required for the usability evaluation of a particular software product [S01 & S03] or relate to the object of study [S04]. The benefits obtained with the definition of new heuristics are a better representation of the case study [S01] than could be achieved using only the Nielsen heuristics [S03], and the discovery of the usability problems one wishes to find in a particular system [S04, S11, S14, S22, S23, S24, S29, S31, S32, S33 & S35]. Additionally, methodologies applied to the generation of new heuristics [S18], such as PROMETHEUS, are presented; these divide the process into stages that validate that the heuristics are being properly constructed, iterating through a refinement step if necessary [S04].
2. Nielsen derived evaluation items: One of the evaluation protocols derived from Nielsen is the use of his ten heuristics in conjunction with Shneiderman’s eight golden rules, grouping them into evaluation elements [S12]. In another protocol, the evaluation is applied in phases: an initial navigation of the website to judge the flow, perception, and dynamics of the interaction, followed by a second phase that uses the metrics of the Nielsen usability heuristics [S13]. Other studies show that the Nielsen heuristics can be used in conjunction with metrics to enrich the collected data [S27], with attributes defined in ISO/IEC 25023 [S28], and in conjunction with case-study-specific heuristics [S34]. Finally, the ten Nielsen heuristics can also be applied by themselves to perform usability evaluations [S15, S21, S30, S36 & S37].
3. Comparison and grouping between evaluations: Comparing the results of heuristic evaluations of pages from the country under evaluation against key international pages provides improvement indicators and recommendations of good practices [S01]. Likewise, grouping evaluation questions allows comparisons between the generated groups in order to obtain more precise answers [S08]. Additionally, dividing heuristic evaluations into groups by the heuristics used allows comparing how many usability problems have been found with each one [S10]. Complementarily, if a heuristic evaluation is made of a website and an additional one is then made of the site improved as a result of the first evaluation, it becomes possible to compare quantitatively how much the score has improved [S17]. Finally, comparing test executions with the Nielsen heuristics against newly designed heuristics allows the measurement of their effectiveness [S18, S23, S24 & S32].
4. Qualification methods: The use of Nielsen’s severity rating and Olson’s ease of fix provides a quantitative assessment of the severity of each usability problem encountered and of how easy or difficult it will be to fix [S02]. Likewise, another form of rating is a heuristic evaluation questionnaire that assigns scores on a five-point Likert scale for the evaluation of prototypes [S16], as well as adding structured sub-elements to the defined heuristics, whose questions can be used by the evaluators [S34].
5. Methodologies applied to heuristic evaluation: The protocols oriented to heuristic evaluation methodologies include a proposal for a heuristic method based on the ISO 9241-151 standard, which consists of executing a heuristic evaluation by applying indicators divided into criteria that allow each indicator to be assessed; this facilitates the evaluation of results per criterion as well as overall [S09]. In addition, a framework is defined for the evaluation of dialogue elements in web pages, so that all evaluators consider the same previously defined elements, together with the definition of a workflow that allows its applicability to other types of software [S22]. Finally, a quantitative procedure for evaluating the results of a heuristic evaluation is proposed to obtain consistent and quantifiable results without the need for user participation [S26].
6. Formal processes with properly defined steps: The heuristic inspection process can be fully formalized through five duly defined steps: (1) planning, (2) training, (3) evaluation, (4) discussion, and (5) report [S05 & S20]. This formalization consolidates the different ways in which evaluations are carried out into a single one, reducing the room for interpretation in how to carry them out [S05 & S20].
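The five duly defined steps of this formal process [S05 & S20] form a strict sequence: each phase must finish before the next begins. A minimal sketch of that ordering constraint follows; the class and method names are ours, for illustration only:

```python
from enum import Enum

class Phase(Enum):
    PLANNING = 1
    TRAINING = 2
    EVALUATION = 3
    DISCUSSION = 4
    REPORT = 5

class FormalHeuristicEvaluation:
    """Tracks progress through the five steps, rejecting out-of-order phases."""

    def __init__(self) -> None:
        self.completed = []

    def complete(self, phase: Phase) -> None:
        # The next phase allowed is the one whose ordinal follows the last
        # completed phase (Phase values are 1-based and consecutive).
        expected = Phase(len(self.completed) + 1)
        if phase is not expected:
            raise ValueError(f"expected {expected.name}, got {phase.name}")
        self.completed.append(phase)

    @property
    def finished(self) -> bool:
        return len(self.completed) == len(Phase)
```

For example, `FormalHeuristicEvaluation().complete(Phase.PLANNING)` succeeds, while starting directly with `Phase.EVALUATION` raises a `ValueError`.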
4 Comparative Analysis and Discussion
After grouping the studies according to their main characteristics, a comparative table was made to compare said protocols or formal processes on general criteria, such as the level of coverage, which consisted of qualifying the contribution to the heuristic evaluation process and classifying it into one of the following levels as appropriate: Low, Medium, and High. In addition, the benefits found and the types of software evaluated by each one were recorded. Finally, the definition of new heuristics protocol was not taken into consideration since, although it is a formal process, its direct contribution to the heuristic evaluation process is not as significant: it only results in new heuristics for the inspection. Table 3 shows the characteristics mentioned.
Table 3. Characteristics of the protocols or formal processes

Nielsen derived evaluation items
  Coverage level: Medium — they contribute mainly to the execution of the heuristic evaluation by using the Nielsen heuristics together with additional elements that allow improving the inspection.
  Found benefits: (1) They allow quantitative results when using metrics related to Nielsen’s heuristics. (2) They allow the heuristic evaluation to be carried out covering the demands proposed by conventional professional standards. (3) They allow adding additional heuristics in conjunction with the Nielsen heuristics.
  Application: Health information websites, medical museums, conferences, libraries, and tourism, as well as video content interfaces.

Comparison and grouping between evaluations
  Coverage level: Low — they contribute mainly to the comparison of results of heuristic evaluations by proposing more than one form of execution and finally evaluating which is the best.
  Found benefits: (1) They allow usability evaluation results to be compared to generate good-practice recommendations. (2) They allow comparing how much the results of a new system have improved compared to the initial one. (3) They allow measuring the effectiveness between groups of heuristics.
  Application: E-commerce, student, health, and transactional websites.

Qualification methods
  Coverage level: Medium — they contribute mainly to the execution of the heuristic evaluation by providing protocols that formalize the qualification during the inspection.
  Found benefits: (1) They allow determining and prioritizing which usability problems must be solved as soon as possible. (2) They allow structuring questions for the evaluators so that their answers are acquired quickly.
  Application: Diabetes screening and tourism websites, as well as virtual museum mobile apps.

Methodologies applied to heuristic evaluation
  Coverage level: Medium — they contribute mainly to the execution of the heuristic evaluation through indicators and the inspection of dialogue elements. In addition, they contribute to the results, as they are procedures for obtaining quantitative results.
  Found benefits: (1) They allow assessments to be given through indicators that facilitate the evaluation of the result per criterion and overall. (2) They allow all evaluations to consider the same navigation elements. (3) They allow obtaining consistent and quantifiable results of the evaluation carried out.
  Application: University and library websites.

Formal processes with properly defined steps
  Coverage level: High — they contribute to the entire heuristic evaluation process through duly defined steps: planning, training, evaluation, discussion, and reporting.
  Found benefits: Consolidates the various ways in which evaluations are carried out into one, reducing the interpretations of usability evaluators in how to carry them out.
  Application: Transactional websites.
Additionally, to carry out the comparative analysis, a comparative analysis matrix was prepared, which allowed the characteristics of each process to be compared through the following criteria:

• Applied contexts: It is verified whether it has been applied in an academic context, an industrial context, or both.
• Types of software evaluated: The types of software present in the various studies.
• Does it divide the evaluation into duly defined steps?: It is verified whether the studies present the protocol or formal process with indications of the steps that must be followed to carry out the inspection.
• Is it a complement or a new proposal to the evaluation process proposed by Nielsen?: It is verified whether the way of carrying out the evaluation is an adaptation of the process proposed by Nielsen that includes additional characteristics, or whether a new process has been proposed.
• Does it include previous training for evaluators?: It is verified whether previous training for evaluators is included as part of the protocol or formal process.
• The validation of the submitted proposal is evidenced: It is verified whether the protocol or formal process used in the studies has been validated.
• Result of the evidence: Detail of the evidence of the validation of the proposal, if any.
• Coverage level of the Heuristic Evaluation: Measurement of the coverage of the evaluation, which can be (1) Low, (2) Medium or partial, or (3) High or total.
• Description of coverage: Justification of the level of coverage of the evaluation.
• Benefits found: Detail of the benefits found from the use of the protocol or formal process.
Table 4 shows a summary of the comparative analysis matrix with the results for its five measurable criteria. These criteria were considered because they were the ones that allowed a comparative analysis between the protocols and formal processes, giving a measurable answer on how much each contributed to the heuristic evaluation process.
Table 4. Summary of the comparative analysis matrix

Protocol or formal process | Does it divide the evaluation into duly defined steps? | Is it a complement or a new proposal to the evaluation process proposed by Nielsen? | Does it include prior training for evaluators? | The validation of the submitted proposal is evidenced | Coverage level of the Heuristic Evaluation
Nielsen derived evaluation items | No | Complement | No | No | Medium or partial
Comparison and grouping between evaluations | No | Complement | No | No | Low
Qualification methods | Yes (one study) | Complement | No | No | Medium or partial
Methodologies applied to heuristic evaluation | No | Complement | No | No | Medium or partial
Formal processes with properly defined steps | Yes | New proposal | Yes | Yes | High or total
The comparison of the characteristics showed that the formal process in the category “Formal processes with properly defined steps” obtained the best results, providing a positive response for each of the criteria: it contributes to the entire heuristic evaluation process through duly defined steps and consolidates the different ways in which evaluations are carried out into one, reducing the room for interpretation by usability evaluators in how to carry them out. For this reason, this formal process was selected as the basis of this research project.
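The selection logic behind Table 4 can be restated mechanically: count the positive responses per protocol and add an ordinal score for the coverage level. The encoding below is our illustration of that comparison, not part of the original matrix; the “Yes (one study)” cell is encoded simply as True:

```python
# Each row: (divides into defined steps, is a new proposal,
#            includes prior training, validation evidenced, coverage level)
COVERAGE_SCORE = {"Low": 1, "Medium or partial": 2, "High or total": 3}

MATRIX = {
    "Nielsen derived evaluation items":              (False, False, False, False, "Medium or partial"),
    "Comparison and grouping between evaluations":   (False, False, False, False, "Low"),
    "Qualification methods":                         (True,  False, False, False, "Medium or partial"),
    "Methodologies applied to heuristic evaluation": (False, False, False, False, "Medium or partial"),
    "Formal processes with properly defined steps":  (True,  True,  True,  True,  "High or total"),
}

def score(row) -> int:
    # Sum the boolean answers, then add the ordinal coverage score.
    *answers, coverage = row
    return sum(answers) + COVERAGE_SCORE[coverage]

selected = max(MATRIX, key=lambda name: score(MATRIX[name]))
print(selected)  # → Formal processes with properly defined steps
```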
5 Selected Formal Process BPMN Model
The detailed diagram of the selected process was made with the aim of modeling in detail the steps and tasks that must be followed to carry out the process properly, leaving no room for different interpretations by the evaluators who use it. In addition, the process has been divided into a main flow consisting of the execution of five sub-processes that correspond to the steps indicated in the selected formal process [9]: (1) planning, (2) training, (3) evaluation, (4) discussion, and (5) report.

This result was achieved using the Bizagi BPMN Modeler tool, which allowed the design and modeling of the process in BPMN notation. The model consists of six flows, detailed below:

• General: Flow in which all the threads that must be followed to carry out the process in its entirety are described. Figure 1 shows the BPMN diagram of this process flow.
Fig. 1. General process flow diagram
• Planning: Flow that defines the initial phase, related to the preparation of the heuristic evaluation. In this phase, the evaluation manager carries out the preliminary steps for the evaluation: the objective and scope of the evaluation, the software product to be evaluated, the profiles and number of evaluators that will make up the team, the set of heuristics, and the evaluation template to be used are defined. Once the above has been defined, the professionals who will form part of the evaluation team are recruited. Figure 2 shows the BPMN diagram of this process flow.
Fig. 2. Planning flow diagram
• Training: Flow that defines the training phase for evaluators with little or no experience. In this phase, the evaluation manager prepares a training session for the evaluators who meet that criterion. Once the training has been completed, the manager informs the evaluation team of the goals, objectives, framework criteria, and a general idea of the software product, as well as the type of evaluation to be carried out, so that the team can then freely explore the system interfaces. Figure 3 shows the BPMN diagram of this process flow.
Fig. 3. Training flow diagram
• Evaluation: Flow that defines the execution phase of the heuristic evaluation of the interfaces of the selected software product. In this phase, the evaluation team examines the interfaces and, depending on the experience of the evaluator, the flow continues in one of two ways:

(1) If the evaluator has experience, they carry out the evaluation by identifying the usability problems in the interfaces, taking screenshots of the problems found, and later associating them with the heuristic principles that were not fulfilled and documenting them in the evaluation template.
(2) If the evaluator has little or no experience, they first familiarize themselves with the heuristic principles and then proceed with the evaluation following one of two options: (a) reviewing the complete list of heuristics to identify problems according to what is learned and remembered at the time, taking screenshots and documenting the findings in the evaluation template; or (b) identifying problems, taking screenshots, and documenting them in the evaluation template while considering only one heuristic at a time.
Figure 4 shows the BPMN diagram of this process flow.
Fig. 4. Evaluation flow diagram
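The branch on evaluator experience described above can be summarized as a small decision function. This is a sketch; the task names are our paraphrase of the flow, not labels taken from the BPMN diagram:

```python
def evaluation_tasks(experienced: bool, one_heuristic_at_a_time: bool = False):
    """Ordered tasks for one evaluator in the evaluation sub-process."""
    if experienced:
        return ["identify usability problems", "take screenshots",
                "associate unmet heuristic principles",
                "document in evaluation template"]
    # Novice branch: familiarization always comes first.
    tasks = ["familiarize with heuristic principles"]
    if one_heuristic_at_a_time:
        # Option (b): inspect considering only one heuristic at a time.
        tasks += ["identify problems per heuristic", "take screenshots",
                  "document in evaluation template"]
    else:
        # Option (a): review the complete list of heuristics.
        tasks += ["review complete heuristic list", "identify problems",
                  "take screenshots", "document in evaluation template"]
    return tasks
```

For instance, `evaluation_tasks(experienced=False)` begins with the familiarization task, mirroring the novice branch of the flow.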
• Discussion: Flow that defines the elaboration phase of the final list of usability problems, determining whether each problem has been found by more than one evaluator and whether it really refers to a usability problem. In this phase, the evaluation manager verbalizes each of the problems so that the evaluator who identified it can provide more details and evidence. Then, the team determines whether the identified aspect really refers to a usability problem; if the team does not consider it so, the evaluator who reported it decides whether they still consider it a usability problem, otherwise it is discarded. If it is a relevant aspect and the problem was identified by more than one evaluator, it must be determined whether it is the same incident or different usability aspects. If it is the same incident, a definition and description of the problem is prepared collaboratively; otherwise, it is classified as a usability problem to be discussed later.

Then, if the identified aspect was classified as a usability problem, it must be determined whether it clearly expresses the incident to which it refers. If not, a definition and description of the identified problem is prepared. Finally, the problem is included in the final list.
42 A. Lecaros et al.
Figure 5 shows the BPMN diagram of this process flow.
Fig. 5. Discussion flow diagram
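The consolidation step of the Discussion flow, in which reports judged to refer to the same incident are merged into one problem, can be sketched as follows. The data shapes are assumptions for illustration: the `incident_key` stands in for the team's human judgement that two reports describe the same incident, and keeping the first description is a placeholder for the collaborative definition step:

```python
from collections import defaultdict

def consolidate(findings):
    """Group raw findings into a final problem list, mirroring the
    Discussion flow: reports of the same incident are merged, and the
    set of evaluators who reported each problem is retained.

    `findings` is an iterable of (evaluator, incident_key, description)
    tuples.
    """
    grouped = defaultdict(list)
    for evaluator, incident_key, description in findings:
        grouped[incident_key].append((evaluator, description))

    final_list = []
    for incident_key, reports in grouped.items():
        evaluators = sorted({evaluator for evaluator, _ in reports})
        # When several evaluators reported the same incident, a single
        # definition is prepared collaboratively; here the first
        # description is kept as a stand-in for that step.
        final_list.append({
            "incident": incident_key,
            "description": reports[0][1],
            "reported_by": evaluators,
        })
    return final_list
```
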
• Report: Flow that defines the assignment of a severity and frequency value to each problem, to determine which problems should be solved most urgently. In this phase, the evaluation manager sends the consolidated list to each member of the team. Each evaluator then rates the severity and frequency of each problem; the individually assigned scores are averaged and the standard deviation is calculated across the ratings of the entire team. If the standard deviation is high, the reasons for the assigned severity and frequency values are discussed in order to reach a consensus on the criticality.
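The scoring step above can be sketched in a few lines. The deviation threshold is an assumption (the process only states that a "high" deviation triggers discussion), and taking criticality as severity plus frequency is a common convention rather than something prescribed here:

```python
from statistics import mean, pstdev

def criticality_report(ratings, deviation_threshold=1.0):
    """For each problem, average the individually assigned severity and
    frequency scores and compute the standard deviation across the
    whole team; problems whose deviation exceeds the threshold are
    flagged for a consensus discussion.

    `ratings` maps a problem id to a list of (severity, frequency)
    pairs, one pair per evaluator.
    """
    report = {}
    for problem, pairs in ratings.items():
        severities = [severity for severity, _ in pairs]
        frequencies = [frequency for _, frequency in pairs]
        report[problem] = {
            "severity": mean(severities),
            "frequency": mean(frequencies),
            # Criticality is commonly taken as severity + frequency.
            "criticality": mean(severities) + mean(frequencies),
            # A large spread in either rating means the team should
            # discuss the reasons behind the assigned values.
            "needs_discussion": max(pstdev(severities),
                                    pstdev(frequencies)) > deviation_threshold,
        }
    return report
```
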
Finally, the evaluation team must offer possible solutions for each of the problems, also highlighting the positive aspects of the proposed interfaces, to prepare a final report that describes all the results of the heuristic evaluation process. Figure 6 shows the BPMN diagram of this process flow.
To validate the modeling, an HCI specialist was interviewed; the specialist validated the result in full by confirming that all the criteria were successfully met, as recorded in the interview. These criteria were the following:
• The detailed diagram of the selected formal heuristic evaluation process was presented in BPMN format using the Bizagi BPMN Modeler tool.
• The flow of the process was clearly understood, with no room for it to be interpreted in more than one way.
• The process was correctly modeled using standard BPMN notation for process modeling.
• The process was adequately explained, and the questions asked about it were clearly answered.
Fig. 6. Report flow diagram
6 Conclusions and Future Work
The objective of this research was to select and model a formal heuristic evaluation process applicable to the usability inspection of software products through a comparative analysis. The results were the following: (1) a comparative table of the formal processes documented in the literature for carrying out a heuristic evaluation, and (2) a detailed diagram of the selected process modeled in BPMN.
The first result was achieved by developing a comparative table with the formal processes obtained as part of the systematic literature review, in which five protocols or formal processes were identified and compared using five measurable criteria. After analyzing all the process proposals, it was possible to conclude that the formal process "Formal processes with duly defined steps" was the most appropriate, because it fully covered the heuristic evaluation process, divided the evaluation into duly defined steps, and, unlike the others, included prior training for the evaluators.
Additionally, this process was presented as a new proposal relative to the evaluation process proposed by Nielsen; its validation through case studies showed that it allows a greater number of severe and critical problems to be found, and that evaluators tend to make fewer errors compared with Nielsen's process.
The second result of this objective was achieved by modeling the process in BPMN notation, presenting in detail the flow that must be followed to carry out the process properly. The modeled process was validated by expert judgment.
By achieving the two mentioned results, it is concluded that the objective has been met, since a solution has been given to the problem caused by the absence of a comparative analysis of the formal processes documented in the literature: a table was prepared with criteria that allowed the formal processes found in the systematic literature review to be evaluated quantitatively and qualitatively, and the process with the best indicators was recommended after reviewing the results obtained.
As future work, the formal process selected in this study will be taken as the basis for developing a software tool that automates the heuristic evaluation process; in this way, the well-defined steps could be represented in that system to support usability evaluators in their heuristic inspections, regardless of their experience level.
Acknowledgments. This work is part of the research project "Virtualización del proceso de evaluación de experiencia de usuario de productos de software para escenarios de no presencialidad" (virtualization of the user experience evaluation process of software products for non-presential scenarios), developed by the HCI-DUXAIT research group. HCI-DUXAIT is a research group that belongs to the PUCP (Pontificia Universidad Católica del Perú).
This work was funded by the Dirección de Fomento a la Investigación at the PUCP through
grant 2021-C-0023.