Accessibility and Performance Evaluation
of Healthcare and E-Learning Sites in India:
A Comparative Study Using TAW and GTMetrix
Kumari Sarita1(B), Parminder Kaur1, and Satinder Kaur2
1Department of Computer Science, Guru Nanak Dev University, Amritsar, Punjab, India
{saritacs.rsh,parminder.dcse}@gndu.ac.in
2Department of Computer Engineering and Technology, Guru Nanak Dev University, Amritsar,
Punjab, India
satinder.dcet@gndu.ac.in
Abstract. Websites play a vital role in the development of the technological world. The use of online platforms is increasing rapidly in the healthcare and e-learning domains, so it becomes necessary to appraise the accessibility and performance of these sites to ensure universal access and practical use. This paper inspects ten websites from two different domains: the first five are healthcare sites and the remaining five are e-learning sites. A tool named TAW is used for accessibility evaluation in terms of the perceivable, operable, understandable, and robust principles against the conformance of WCAG 2.0 level AA. In addition, GTMetrix is used to test performance in terms of page speed metrics. The results show that sites in both domains fail to fulfill the minimum accessibility requirements; however, e-learning sites performed better than healthcare sites from a performance perspective. Moreover, some suggestions are made to address the accessibility and performance deficiencies of the sites. Furthermore, this work can be extended by employing an intelligent accessibility tool based on AI-powered methodologies such as machine learning.
Keywords: Accessibility · Performance · Healthcare · e-Learning · TAW · GTMetrix · WCAG 2.0
1 Introduction
Online education is an imperative constituent of students' competitive lives today, and they prefer e-learning through computers, multimedia, and internet technologies [5]. Online medical platforms have also become a significant part of daily life: due to busy schedules, people feel more comfortable consulting online about health-related problems [17]. Hence, it is a primary requirement to make e-learning and healthcare sites fully accessible and performance-effective. Web accessibility concerns universal access to the web for all kinds of people, regardless of disability [11]. To achieve accessibility, standards such as the Web Content Accessibility Guidelines: WCAG 1.0,
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
B. Iyer et al. (Eds.): ICCET 2022, SIST 303, pp. 172–187, 2022.
https://doi.org/10.1007/978-981-19-2719-5_16
WCAG 2.0, and WCAG 2.1 have been framed by the World Wide Web Consortium (W3C) [26, 28, 29]. The first version, WCAG 1.0, is organized around priority levels, checkpoints, and conformance levels, while the later versions WCAG 2.0 and WCAG 2.1 are organized around principles, guidelines, success criteria, and conformance levels [28, 29].
Moreover, performance is one of the essential requirements for site evaluation [18]. It can be measured with metrics such as First Contentful Paint, Speed Index, and Total Blocking Time. This paper compares India's top five e-learning sites with its top five healthcare sites with respect to accessibility and performance. The remainder of the paper is structured as follows: Sect. 2 reviews related studies on web accessibility and performance assessment and their limitations, and outlines the objectives of the study. Section 3 describes the methodology adopted for evaluation. Section 4 presents the evaluation results and highlights significant issues and suggestions to improve accessibility and performance. Finally, Sect. 5 concludes the paper based on the significant findings and provides recommendations for further work.
2 Literature Review
Websites’ universal accessibility and effective performance are a significant concern [11,
13,18]. Table 1provides an overlook of previous studies on automated evaluation of
websites from different domains. Gilberto et al. [21] conducted a comparative study of
Spanish, American, and British healthcare sites, and after comparison, it was found that
not all sites passed the basic requirements of accessibility guidelines. Kaur et al. [17]
carried out a study on the 280 healthcare websites in the metro cities of India, and various
accessibility problems were found during the evaluation. Additionally, Salarvand et al.
[22] recorded a very low-quality score for the public healthcare sites in Tehran.
Furthermore, Acosta-Vargas et al. [1] investigated the accessibility of healthcare websites and found that the sites were unable to fulfill the accessibility requirements. Kumar et al. [20] studied e-learning accessibility and observed that the outcomes were not very effective. Akgül [5] evaluated the accessibility of MOOC websites using automated testing, while Seale [24] studied e-learning and disability. Furthermore, Ismail and Kuppusamy [11] conducted an exploratory study on the accessibility of university homepages; the results revealed various issues related to accessibility, speed, navigability, and content. In recent studies, Akgül [4], AlMeraj et al. [6], and Barricelli et al. [7] investigated the accessibility of educational websites and found that the sites were not created per the rulesets specified for accessibility. Adepoju et al. [2] used TAW to assess the accessibility of four Nigerian mobile network operators' websites; the outcomes indicated that no site completely fulfilled the WCAG 2.0 guidelines. Similarly, Kaur and Dani [16] used TAW on Indian banking websites, while Agrawal et al. [3] applied TAW to Indian airline websites. According to an investigation by Inal and Karaim [15], all the Libyan government websites evaluated with TAW failed to fulfill the WCAG conformance levels. In addition, the authors of [2, 25] emphasized the use of TAW for accessibility assessment.
However, some researchers used the GTMetrix tool to test the page performance of
websites, as it provides a detailed performance evaluation report covering each aspect of page performance. Kaur et al. [19] used GTMetrix to evaluate different university sites. Moreover, Hungarian government websites were evaluated using the GTMetrix tool [9], and Ismail et al. [12, 14] used GTMetrix to evaluate encyclopedia and social media websites. It is evident from the previous studies that the websites were poorly designed and developed, as they failed to completely satisfy the accessibility and performance requirements.
2.1 Limitations of the Studies
Certain limitations emerge from the studies reviewed in the literature. Past studies focused on site evaluation within a single domain, so they lacked a comparison of websites belonging to different domains. As per the literature reviewed, the number of studies evaluating healthcare and e-learning sites is meager. The current study is not limited to a single domain: it compares sites from two different domains.
2.2 Research Objectives
The study aims to evaluate and compare ten websites from two different domains. The overall objectives are as follows:
• To select the sites based on popularity using Alexa rank.
• To evaluate the sites using automated testing.
• To find the current status of the sites in terms of accessibility and performance.
• To analyze and compare the results.
• To highlight the significant issues and suggestions impacting the accessibility and performance of the sites.
• To provide future recommendations based on the findings.
3 Methodology
This paper appraises and compares five e-learning and five healthcare sites. These sites were selected according to their Alexa traffic ranking, listed in Table 1. Testing is started by entering a site's URL into the respective tool's address bar, and the user receives the assessment report when the testing process is over.
Table 1. Sites with Alexa rank
Sr. no.  Healthcare sites               Rank   E-learning sites               Rank
1        https://www.practo.com         460    https://www.udemy.com          103
2        http://www.medindia.net/       929    https://www.w3schools.com      185
3        https://www.healthkart.com/    1868   https://www.coursera.org       261
4        https://www.mohfw.gov.in/      2216   https://www.geeksforgeeks.org  293
5        https://www.onlymyhealth.com/  6215   https://www.edx.org            962
3.1 Automated Tools
Several methodologies exist for site accessibility evaluation, such as automated evaluation, user evaluation, and expert (manual) evaluation. The current study adopts automated testing to determine the accessibility level of websites, as it is easy to use and time-effective. Two automated tools are therefore chosen for the present study: TAW for accessibility evaluation and GTMetrix for performance evaluation.
TAW: TAW (Test de Accesibilidad Web) is published by the Spanish foundation Centre for the Development of Information and Communication Technologies in Asturias (CTIC) [10] and is available at https://www.tawdis.net/ [10]. It frames the accessibility evaluation results as problems, warnings, and not reviewed [10], checked against the conformance of WCAG 2.1, WCAG 2.0, or WCAG 1.0 (Web Content Accessibility Guidelines) [10].
• Problems: Accessibility errors that the automated tool detects on its own, without any human assistance.
• Warnings: Potential issues that are challenging to resolve without human intervention.
• Not reviewed: Checks that are almost impossible for automated tools to perform and must be evaluated manually.
GTMetrix: GTMetrix is an analytical tool available at https://www.gtmetrix.com [8]. It is used for the performance evaluation of websites and provides evaluation results as performance grades, scores, and page loading metrics [8]. In addition, it grades web pages on both complete page performance and page loading speed.
3.2 Accessibility and Performance Metrics
(See Table 2)
Table 2. Metrics for evaluation

Accessibility metric            Meaning
Perceivable                     The content, interface, and components should be findable by each person [27]
Operable                        All controls, buttons, navigation, and other components must be usable by all [27]
Understandable                  All users must understand, utilize, and learn the interface [27]
Robust                          There should be an option for disabled people to select the technology used to communicate with the sites [27]

Performance metric              Meaning
First Contentful Paint (FCP)    The time at which the first text or image is painted on the web page [8]
Largest Contentful Paint (LCP)  The time for the largest content element to appear in the visitor's viewport [8]
Time To Interactive (TTI)       The time until the web page becomes fully interactive [8]
Speed Index (SI)                The time in which the page contents become visibly populated [8]
Total Blocking Time (TBT)       The total duration for which the web page remained blocked [8]
Cumulative Layout Shift (CLS)   Arises when a visible element on a page shifts unexpectedly by changing its size or position [8]
Method to Find the Level of Accessibility
A Five-Level Accessibility Criteria (FLAC), shown in Table 3, is used to determine the accessibility level of sites in both domains [23]. Each site is assigned an accessibility level according to the range its error percentage falls into, as per FLAC.
Table 3. FLAC

Level of accessibility   Error percentage
Maximal                  0–10%
High                     11–30%
Moderate                 31–60%
Low                      61–90%
Minimal                  91–100%
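The FLAC mapping above can be written as a small lookup. The sketch below is illustrative only; the function name and the decision to treat boundary values as the lower level are our own assumptions, not specified in the paper:

```python
def flac_level(error_percentage: float) -> str:
    """Map an error percentage (0-100) to a FLAC accessibility level."""
    if error_percentage <= 10:
        return "Maximal"
    elif error_percentage <= 30:
        return "High"
    elif error_percentage <= 60:
        return "Moderate"
    elif error_percentage <= 90:
        return "Low"
    else:
        return "Minimal"

# A site with 8% errors is maximally accessible; one with 45% is moderate.
print(flac_level(8))   # Maximal
print(flac_level(45))  # Moderate
```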
Method to Find Average Values of Performance Metrics
Average values of the performance metrics are calculated as follows:

Average FCP = (Sum of FCP of all sites) / (Total number of sites)
Average TTI = (Sum of TTI of all sites) / (Total number of sites)
Average SI  = (Sum of SI of all sites) / (Total number of sites)
Average TBT = (Sum of TBT of all sites) / (Total number of sites)
Average LCP = (Sum of LCP of all sites) / (Total number of sites)
Average CLS = (Sum of CLS of all sites) / (Total number of sites)
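Each of these averages is a simple arithmetic mean over the five sites in a domain. As a minimal sketch (the metric readings below are hypothetical, not the paper's measurements):

```python
def average_metric(values):
    """Arithmetic mean of one performance metric across all evaluated sites."""
    return sum(values) / len(values)

# Hypothetical FCP readings (seconds) for five sites -- illustrative only:
fcp_readings = [0.5, 0.6, 0.7, 0.6, 0.6]
average_fcp = average_metric(fcp_readings)
print(round(average_fcp, 2))  # 0.6
```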
4 Results and Discussions
4.1 Accessibility Evaluation
Figure 1 shows how the accessibility evaluation tool TAW generates results, which are presented as problems, warnings, and not reviewed. As per the results in Table 4, healthcare sites show the greatest number of violations for perceivability, while e-learning sites generate the most errors concerning robustness. On the other hand, understandability is the least violated factor in both healthcare and e-learning sites, as shown in Table 5. Continuing with the criteria, the accessibility results are compiled as per FLAC [23] in Table 6. According to FLAC, three healthcare sites achieve the maximal level of accessibility and two are moderately accessible, whereas all five e-learning sites achieve only a high level of accessibility and none attains the maximal level. No site from either domain falls under the low or minimal level of accessibility. Overall, healthcare sites are more violated, as they generate many more errors. The combined accessibility results produced by TAW are shown in Fig. 2 and Fig. 3.
Fig. 1. Screenshot of accessibility evaluation tool TAW
Table 4. Detailed results by TAW
Principle        Error type     Healthcare sites   E-learning sites
Perceivable      Problems       277                15
                 Warnings       953                243
                 Not reviewed   19                 19
                 Total          1249               277
Operable         Problems       154                23
                 Warnings       314                282
                 Not reviewed   31                 37
                 Total          499                342
Understandable   Problems       48                 7
                 Warnings       66                 48
                 Not reviewed   27                 23
                 Total          141                78
Robust           Problems       378                171
                 Warnings       77                 208
                 Not reviewed   0                  2
                 Total          455                381
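The per-principle totals and the most/least violated factors follow directly from the counts in Table 4. A sketch using the healthcare-site figures (the dictionary layout is our own; only the numbers come from the paper):

```python
# Problems/warnings/not-reviewed counts per POUR principle
# (healthcare-site figures from Table 4).
healthcare = {
    "Perceivable":    {"problems": 277, "warnings": 953, "not_reviewed": 19},
    "Operable":       {"problems": 154, "warnings": 314, "not_reviewed": 31},
    "Understandable": {"problems": 48,  "warnings": 66,  "not_reviewed": 27},
    "Robust":         {"problems": 378, "warnings": 77,  "not_reviewed": 0},
}

# Total violations per principle, and the extremes reported in Table 5.
totals = {p: sum(c.values()) for p, c in healthcare.items()}
most_violated = max(totals, key=totals.get)
least_violated = min(totals, key=totals.get)
print(totals)           # {'Perceivable': 1249, 'Operable': 499, ...}
print(most_violated)    # Perceivable
print(least_violated)   # Understandable
```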
Table 5. Most and least violated factors
Violation type Healthcare sites E-learning sites
Most violated Perceivable Robust
Least violated Understandable Understandable
Table 6. Accessibility results as per FLAC
Level of accessibility   Healthcare sites              E-learning sites
Maximal                  practo, mohfw, onlymyhealth   None
High                     None                          udemy, w3schools, coursera, geeksforgeeks, edx
Moderate                 medindia, healthkart          None
Low                      None                          None
Minimal                  None                          None
Fig. 2. Results by TAW
Fig. 3. POUR results
Fig. 4. Screenshot of performance evaluation tool GTMetrix
Significant Issues Impacting Site Accessibility
(See Table 7)
Table 7. Violated success criteria
Guideline Success criteria Level
1.1 Non-text content A
1.2 Audio-only and video-only (prerecorded) A
1.2 Captions (prerecorded) A
1.2 Audio description or media alternative (prerecorded) A
1.2 Audio description (prerecorded) A
1.3 Info and relationships A
1.3 Meaningful sequence A
1.3 Sensory characteristics A
1.4 Use of color A
1.4 Audio control A
1.4 Contrast (minimum) AA
1.4 Resize text AA
1.4 Images of text AA
2.1 Keyboard A
2.1 No keyboard trap A
2.2 Timing adjustable A
2.2 Pause, stop, hide A
2.3 Three flashes or below threshold A
2.4 Bypass blocks A
2.4 Page titled A
2.4 Focus order A
2.4 Link purpose (in context) A
2.4 Multiple ways AA
2.4 Headings and labels AA
2.4 Focus visible AA
3.1 Language of page A
3.1 Language of parts AA
3.2 On focus A
3.2 On input A
3.2 Consistent navigation AA
3.2 Consistent identification AA
3.3 Error identification A
3.3 Labels or instructions A
3.3 Error suggestion AA
3.3 Error prevention (legal, financial, data) AA
4.1 Parsing A
4.1 Name, role, value A
Suggestions to Reduce Violations in Success Criteria
Some tips can help a developer reduce the most common success criteria violations.
• Provide a text alternative for any image or time-based content for visually impaired users.
• Content should be presentable in more than one way.
• Color should not be the only medium used to convey information in a visual element.
• There must be a way to stop a video that plays automatically.
• All functionality must be operable through the keyboard.
• Content should not be lost even after zooming beyond 100%.
• It should be easy for users to navigate from one page to another.
• There must be assistance that helps the user prevent errors.
4.2 Performance Evaluation
For the performance evaluation by GTMetrix, average values of six performance metrics are calculated. Table 8 shows the comparative status of both domains in terms of their average metric values. E-learning sites do not exceed the normal range of values and score very well, while healthcare sites exceed the normal range and need immediate improvement. The overall page performance is expressed in Table 9 as grades and percentage scores. According to GTMetrix, e-learning sites performed better than healthcare sites. Figure 5, Fig. 6, and Fig. 7 present the overall results of the performance evaluation generated by GTMetrix.
Table 8. Comparative status of average values of performance metrics

Sr. no.  Metric  Normal value   E-learning sites  Status            Healthcare sites  Status
1        FCP     ≤ 0.9 s        0.6 s             No action needed  0.4 s             No action needed
2        TTI     ≤ 2.5 s        2.5 s             No action needed  3.2 s             Action needed
3        SI      ≤ 1.3 s        0.7 s             No action needed  2.1 s             Action needed
4        TBT     ≤ 0.1 s        0.1 s             No action needed  0.4 s             Action needed
5        LCP     ≤ 1.2 s        0.8 s             No action needed  1.2 s             No action needed
6        CLS     ≤ 0.1          0.05              No action needed  0.1               No action needed

(CLS is a unitless score.)
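The status column in Table 8 follows a simple rule: a metric needs action only when its average exceeds the normal maximum. A sketch of that rule (the function and dictionary names are ours; thresholds and averages are from Table 8):

```python
# Normal maxima from Table 8 (seconds, except CLS which is unitless).
NORMAL_MAX = {"FCP": 0.9, "TTI": 2.5, "SI": 1.3, "TBT": 0.1, "LCP": 1.2, "CLS": 0.1}

def status(metric: str, average: float) -> str:
    """'Action needed' only when the average exceeds the normal maximum."""
    return "Action needed" if average > NORMAL_MAX[metric] else "No action needed"

# Healthcare-site averages from Table 8:
healthcare_avgs = {"FCP": 0.4, "TTI": 3.2, "SI": 2.1, "TBT": 0.4, "LCP": 1.2, "CLS": 0.1}
for metric, avg in healthcare_avgs.items():
    print(metric, status(metric, avg))
```

Note that values sitting exactly at the threshold (LCP 1.2 s, CLS 0.1) count as acceptable, matching the table.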
Table 9. Performance grading and scoring

E-learning sites  Grade  Score   Healthcare sites  Grade  Score
Udemy             B      88%     Practo            A      97%
W3schools         A      100%    Medindia          B      95%
Coursera          C      62%     Healthkart        F      31%
Geeksforgeeks     A      100%    Mohfw             A      99%
Edx               A      93%     Onlymyhealth      A      92%
Fig. 5. E-learning sites results by GTMetrix
Fig. 6. Healthcare sites results by GTMetrix
Fig. 7. Combined results by GTMetrix
Major Issues Impacting Performance
(See Table 10)
Table 10. Performance issues

Impact level  Type of issue
Med-High      Large DOM size
Med-Low       No use of a Content Delivery Network (CDN)
Med-Low       Use of HTTP/1
Low           Improperly sized images
Med-Low       Use of document.write()
Low           Outdated image formats
High          Long initial server response time, especially for the main document
Med-Low       Use of CSS @import
Low           Text compression is disabled
Suggestions to Resolve Performance Issues
The following tips are suggested to resolve the performance-related issues:
• Keep the DOM size as small as possible to avoid memory wastage.
• Make use of a Content Delivery Network (CDN).
• Prefer HTTP/2 over HTTP/1.
• Size images properly.
• Avoid document.write() to reduce page load delay.
• Use next-gen image formats rather than PNG or JPEG for better compression and faster downloads.
• Keep the initial server response time short.
• Avoid CSS @import to reduce additional delays in page load time.
• Enable text compression.
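Whether a server has text compression enabled can be verified from the Content-Encoding header of an HTTP response. The helper below is a hypothetical sketch that inspects a headers dictionary; in practice one would first send a request with an `Accept-Encoding: gzip, br` header and pass in the response headers:

```python
def text_compression_enabled(response_headers: dict) -> bool:
    """True if the server compressed the response body (gzip, Brotli, or deflate)."""
    encoding = response_headers.get("Content-Encoding", "").lower()
    return encoding in {"gzip", "br", "deflate"}

print(text_compression_enabled({"Content-Encoding": "gzip"}))  # True
print(text_compression_enabled({}))                            # False
```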
As per the analysis of past studies, previous researchers evaluated only single-domain websites using an automated tool, whereas the current study evaluates ten websites from two different domains. Furthermore, the current automated evaluation methodology analyzes the results based on the Five-Level Accessibility Criteria (FLAC) proposed in a recent study [23]. Therefore, unlike previous work, the present study can classify sites into the maximal, high, moderate, low, and minimal levels of accessibility. In addition, performance is considered as a second evaluation dimension, measured using the proposed method of averaging the metric values. Moreover, no previous researcher has covered both accessibility and performance in a single study.
5 Conclusion and Future Recommendations
The current study has evaluated and compared sites from two different domains with respect to accessibility and performance. The practical contributions of the study are evaluating site accessibility using TAW against the conformance of WCAG 2.0 level AA and understanding the status of the sites by calculating the average values of performance metrics. In addition, the study contributes by evaluating two different domains rather than a single one and by considering both accessibility and performance metrics in a single study. The findings revealed that no website from either domain had zero accessibility errors. Hence, site developers are strongly recommended to follow the accessibility guidelines and make the necessary efforts to optimize their user interfaces for impaired users. As per the findings, the e-learning sites performed better than the healthcare sites when evaluated by GTMetrix. The current evaluation considers only two domains and can be extended to a larger number of domains in future evaluations.
Moreover, this study provides suggestions to reduce the violations related to accessibility and performance. Further, this work can be extended by employing an intelligent accessibility tool based on artificial intelligence methodologies such as machine learning. Furthermore, it is also recommended to combine other methodologies, such as user testing and expert testing, with automated testing to refine the evaluation for more consistent and enhanced results.
References
1. Acosta-Vargas, P., Acosta, T., Lujan-Mora, S.: Framework for accessibility evaluation of hospital websites. In: 2018 International Conference on eDemocracy & eGovernment (ICEDEG), pp. 9–15. IEEE (2018)
2. Adepoju, S.A., Ekundayo, A., Ojerinde, A.O., Ahmed, R.: Usability and accessibility evaluation of Nigerian mobile network operators' websites. In: 2019 2nd International Conference of the IEEE Nigeria Computer Chapter (NigeriaComputConf), pp. 1–7. IEEE (2019)
3. Agrawal, G., Kumar, D., Singh, M., Dani, D.: Evaluating accessibility and usability of airline websites. In: Singh, M., Gupta, P.K., Tyagi, V., Flusser, J., Ören, T., Kashyap, R. (eds.) ICACDS 2019. CCIS, vol. 1045, pp. 392–402. Springer, Singapore (2019). https://doi.org/10.1007/978-981-13-9939-8_35
4. Akgül, Y.: Accessibility, usability, quality performance, and readability evaluation of university websites of Turkey: a comparative study of state and private universities. Univ. Access Inf. Soc. 20(1), 157–170 (2020)
5. Akgül, Y.: Accessibility evaluation of MOOC websites of Turkey. J. Life Econ. 5(4), 23–36 (2018). https://doi.org/10.15637/jlecon.259
6. AlMeraj, Z., Boujarwah, F., Alhuwail, D., Qadri, R.: Evaluating the accessibility of higher
education institution websites in the state of Kuwait: empirical evidence. Univ. Access Inf.
Soc. 20(1), 121–138 (2020). https://doi.org/10.1007/s10209-020-00717-8
7. Barricelli, B.R., Casiraghi, E., Dattolo, A., Rizzi, A.: 15 years of Stanca act: are Italian public
universities websites accessible? Univ. Access Inf. Soc. 20(1), 185–200 (2020). https://doi.
org/10.1007/s10209-020-00711-0
8. Carbon60: Page speed test tool. https://www.gtmetrix.com
9. Csontos, B., Heckl, I.: Accessibility, usability, and security evaluation of Hungarian government websites. Univ. Access Inf. Soc. 20(1), 139–156 (2020). https://doi.org/10.1007/s10209-020-00716-9
10. CTIC: TAW web accessibility test. https://www.tawdis.net/
11. Ismail, A., Kuppusamy, K.: Accessibility of Indian universities’ homepages: an exploratory
study. J. King Saud Univ. Comput. Inf. Sci. 30(2), 268–278 (2018)
12. Ismail, N.A., Jamaluddin, F.I., Hamidan, A.H., Ali, A.F., Mohamed, S.E., Said, C.S.: Usability evaluation of encyclopedia websites. Int. J. Innov. Comput. 11(1), 21–25 (2021)
13. Jati, H., Dominic, D.D.: Website accessibility performance evaluation in Malaysia. In: 2008
International Symposium on Information Technology, vol. 1, pp. 1–3. IEEE (2008)
14. Jun, T.W., Xiang, L.Z., Ismail, N.A., Yi, W.G.R.: Usability evaluation of social media websites.
Int. Res. J. Mod. Eng. Technol. Sci. 3(1), 216–221 (2021)
15. Karaim, N.A., Inal, Y.: Usability and accessibility evaluation of Libyan government websites.
Univ. Access Inf. Soc. 18(1), 207–216 (2017). https://doi.org/10.1007/s10209-017-0575-3
16. Kaur, A., Dani, D.: Banking websites in India: an accessibility evaluation. CSI Trans. ICT
2(1), 23–34 (2014). https://doi.org/10.1007/s40012-014-0040-x
17. Kaur, A., Dani, D., Agrawal, G.: Evaluating the accessibility, usability and security of hospitals websites: an exploratory study. In: 2017 7th International Conference on Cloud Computing, Data Science & Engineering-Confluence, pp. 674–680. IEEE (2017)
18. Kaur, S., Gupta, S.K.: Key aspects to evaluate the performance of a commercial website. Int.
J. Comput. Appl. 1(1), 1–5 (2014)
19. Kaur, S., Kaur, K., Kaur, P.: An empirical performance evaluation of universities website. Int.
J. Comput. Appl. 146(15), 10–16 (2016)
20. Kumar, K.L., Owston, R.: Evaluating e-learning accessibility by automated and student-centered methods. Educ. Tech. Res. Dev. 64(2), 263–283 (2015). https://doi.org/10.1007/s11423-015-9413-6
21. Llinás, G., Rodríguez-Iñesta, D., Mira, J.J., Lorenzo, S., Aibar, C.: A comparison of websites
from Spanish, American and British hospitals. Meth. Inf. Med. 47(02), 124–130 (2008).
https://doi.org/10.3414/ME0474
22. Salarvand, S., Samadbeik, M., Tarrahi, M., Salarvand, H.: Quality of public hospitals websites: a cross-sectional analytical study in Iran. Acta Informatica Medica 24(2), 130 (2016). https://doi.org/10.5455/aim.2016.24.130-133
23. Sarita, K., Kaur, P., Kaur, S.: Accessibility of healthcare sites: evaluation by automated tools. In: Saraswat, M., Roy, S., Chowdhury, C., Gandomi, A.H. (eds.) Proceedings of International Conference on Data Science and Applications: ICDSA 2021, Volume 2, pp. 625–636. Springer, Singapore (2022). https://doi.org/10.1007/978-981-16-5348-3_50
24. Seale, J.: E-learning and Disability in Higher Education: Accessibility Research and Practice.
Routledge (2013)
25. Verkijika, S.F., De Wet, L.: Accessibility of South African university websites. Univ. Access
Inf. Soc. 19(1), 201–210 (2020)
26. W3C: WCAG 1.0. https://www.w3.org/tr/wcag10/guidelines
27. WebAIM: POUR principles, WCAG checklist. https://webaim.org/standards/wcag/checklist
28. W3C: WCAG 2.0. https://www.w3.org/tr/wcag20/guidelines
29. W3C: WCAG 2.1. https://www.w3.org/tr/wcag21/new-features-in-wcag-2-1