Content uploaded by Fatima Nadeem Al-Aswadi on Oct 17, 2022.
RELU DROPOUT DEEP BELIEF NETWORK FOR
ONTOLOGY SEMANTIC RELATION DISCOVERY
FATIMA N. AL-ASWADI, HUAH YONG CHAN, & KENG HOON GAN
Universiti Sains Malaysia, Pulau Pinang, Malaysia.
Hodeidah University, Hodeidah, Yemen.
PRESENTED BY: Fatima N. AL-Aswadi
OUTLINE
Background
Research Problem
Configuring Learning Strategy
Deep Belief Network (DBN)
Adjusted DBN (Relu Dropout DBN)
Experiments & Results
Conclusion
BACKGROUND
[Figure: the ontology learning pipeline identifies concepts and relations in text and maps them into an ontology.]
Relations discovery is considered the backbone task of any ontology learning system.
ONTOLOGY & ONTOLOGY LEARNING
Many applications are based on ontologies:
Semantic Searching
Decision-Support Systems
Automated Fraud Detection
Question-Answering Systems
IMPORTANCE OF ONTOLOGY
RELATIONS DISCOVERY
Relations are the taxonomic or non-taxonomic (semantic) connections between
extracted concepts that form the facts and rules of a specific domain.
The relations discovery task aims to extract the relationships among selected concepts.
Relations discovery techniques are a combination of Natural Language Processing
(NLP) and Machine Learning (ML).
RESEARCH PROBLEM
Existing approaches extract only taxonomic relations (or a very limited set of relations),
or rely on large sets of predefined patterns, which yield very low recall, as in
(A. El-Kilany et al., 2017; Gillani Andleeb, 2015; L. Bergelid, 2018; Sureshkumar & Zayaraz, 2015;
J. Zhang et al., 2016; J. Wang et al., 2018; Zhong et al., 2016).
Proposed solution:
Develop an efficient automatic method that includes two stages:
• adjusting and configuring an appropriate learning strategy that can automatically detect or learn relations (this paper's contribution)
• developing a relations extraction and classification technique based on the established learning strategy (upcoming contribution)
CONFIGURING LEARNING STRATEGY
DEEP BELIEF NETWORK (DBN)
• A DBN is a stack of Restricted Boltzmann Machines (RBMs).
• The learning algorithm of a DBN has two stages: pre-training and fine-tuning.
• An RBM is a symmetric bipartite graph (each visible node is connected to each hidden node) and consists of two layers (visible and hidden).
[Figure: an RBM with visible units v0..vn (biases a0..an), hidden units h0..hm (biases b0..bm), and a weight wij connecting every visible unit to every hidden unit (w00..wnm).]
• Each layer in a DBN has a double role: it serves as the hidden layer for the nodes that come before it and as the visible layer for the nodes that come after it.
[Figure: a DBN stacking three RBMs (RBM1, RBM2, RBM3) between the input layer and the output layer; each hidden layer h1, h2, h3 is shared between adjacent RBMs.]
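The "double role" of each layer can be sketched in a few lines of numpy: each RBM's hidden activations become the visible input of the RBM above it. The layer widths and random data below are illustrative, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_hidden(v, W, b):
    # P(h = 1 | v): each hidden unit conditions on every visible unit
    return sigmoid(b + v @ W)

# A stack of three RBMs: each RBM's hidden layer is the next RBM's
# "visible" layer (the double role described above).
sizes = [10, 8, 6, 4]                       # toy layer widths
params = [(rng.normal(0.0, 0.1, (m, n)), np.zeros(n))
          for m, n in zip(sizes, sizes[1:])]

x = rng.random(10)                          # a toy visible vector
for W, b in params:
    x = rbm_hidden(x, W, b)                 # greedy layer-by-layer pass

print(x.shape)   # (4,)
```

Pre-training runs this pass greedily, training one RBM at a time; fine-tuning then adjusts all layers jointly with backpropagation.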
DEEP BELIEF NETWORK (DBN) CONT.
[Figure: the same DBN (input layer, hidden layers h1-h3, output layer) shown without dropout and with dropout; with dropout, a random subset of units is inactive during training.]
Traditional vs. Adjusted (Re-DDBN):

Traditional (Sigmoid):
h_j = f(b_j + Σ_{i=1}^{n} v_i w_ij),  with f(x) = 1 / (1 + e^(-x))

Adjusted (Re-DDBN, ReLU + dropout):
h_j = r_j · f(b_j + Σ_{i=1}^{n} v_i w_ij),  where r_j = 1 if the unit is kept and r_j = 0 if it is dropped,
and f is the ReLU (Rectified Linear) function: f(x) = max(0, x), i.e., f(x) = x if x > 0, and 0 otherwise.
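The two activation functions and the dropout mask above can be sketched directly in numpy. The rescaling by 1/(1 − p) ("inverted dropout") is a common convention and an assumption here, not something the slides specify.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def dropout(h, p_drop, rng):
    # r ~ Bernoulli(1 - p_drop): kept units pass through, dropped units emit 0.
    # Inverted-dropout rescaling keeps the expected activation unchanged.
    mask = (rng.random(h.shape) >= p_drop).astype(float)
    return h * mask / (1.0 - p_drop)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(z))                     # zero for non-positive inputs, identity above 0
print(sigmoid(z))                  # saturates toward 0 and 1 at the extremes
print(dropout(relu(z), 0.4, rng))  # a random subset of units zeroed out
```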
Re-DDBN HYPER-PARAMETERS

Hyper-parameter          | Value/Type
-------------------------|--------------------------------------------
Hidden layers            | 3
Neurons/nodes            | 800, 600, 500
Hidden layers activation | ReLU
Output layer activation  | Softmax
Learning rate            | 0.2, 0.3
Batch size               | Small dataset: 100~180; Large dataset: 180~270
Epochs                   | 50~80
Dropout                  | 0.4
Optimization algorithm   | SGD

Hundreds of scenarios and experiments were conducted to determine the hyper-parameters' values.
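Wiring the table's values together, a forward pass through the fine-tuned network can be sketched as below. The hidden widths (800, 600, 500), ReLU, softmax output, and dropout 0.4 come from the table; the input width (300) and the number of output classes (6) are assumptions made only so the sketch runs end to end.

```python
import numpy as np

rng = np.random.default_rng(2)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Three hidden layers of 800, 600, 500 units, per the table;
# input width 300 and 6 classes are illustrative assumptions.
sizes = [300, 800, 600, 500, 6]
weights = [rng.normal(0.0, 0.01, (m, n)) for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x, train=True, p_drop=0.4):
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
        if train:  # dropout is applied at training time only
            mask = (rng.random(x.shape) >= p_drop).astype(float)
            x = x * mask / (1.0 - p_drop)
    return softmax(x @ weights[-1] + biases[-1])

probs = forward(rng.random(300), train=False)
print(probs.shape)   # (6,) - one probability per class, summing to 1
```

Training with SGD at the listed learning rates and batch sizes would update `weights` and `biases` from the gradients of a cross-entropy loss over these softmax outputs.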
EXPERIMENTS & RESULTS
DATASET
SemEval-2018 Task 7 is a collection of two corpora, ACLRelAcS and ACL RD-TEC
2.0, within the computational linguistics domain. They are based on the ACL
Anthology Reference Corpus (a digital archive of papers published in journals
and conferences in the NLP and computational linguistics domain).
It is divided into two sizes: the small subset has 800 samples and the large subset
contains 1,800 samples.
https://lipn.univ-paris13.fr/~gabor/semeval2018task7/
DROPOUT STRATEGY EXPERIMENTS
Different percentages of dropout were applied to Re-DDBN, and the outcomes
revealed that 40% dropout (p = 0.4) gave the best results for both the small
and the large samples.
Accuracy of Re-DDBN for Different Dropout Percentages
COMPARATIVE MODELS
Traditional DBN: as used in (H. Wang, 2015; Zhong et al., 2016) (Sigmoid + DBN)
Re-DBN: as used in (Dai et al., 2017) (ReLU + DBN)
Sig-DDBN: as used in (J. Huang & Guan, 2021) (Sigmoid + Dropout + DBN)
Multinomial NB: same as used in (Sureshkumar & Zayaraz, 2015)
Linear SVM: same as used in (Bergelid, 2018)
Re-DDBN: the adjusted model (ReLU + Dropout + DBN)
Hundreds of scenarios and experiments were conducted to select the hyper-parameters for
these variations of DBN.
COMPARISON EXPERIMENT
The comparative experiments were conducted 10 times with 3-fold
cross-validation (n = 30 runs) for each model to obtain the results.
The results were calculated using the macro function of k-fold
cross-validation.
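The evaluation loop above (10 repetitions × 3 folds = 30 scored runs, reported as mean ± standard deviation) can be sketched as follows. The `evaluate` function is a stand-in: in the actual experiments it would train a model on the training folds and score it on the held-out fold.

```python
import numpy as np

rng = np.random.default_rng(3)

def kfold_indices(n_samples, k, rng):
    # shuffle once per repetition, then split into k disjoint folds
    idx = rng.permutation(n_samples)
    return np.array_split(idx, k)

def evaluate(train_idx, test_idx):
    # Stand-in scorer (random accuracy in a plausible range);
    # the real experiments train/test a model here.
    return rng.uniform(0.40, 0.55)

repeats, k, n_samples = 10, 3, 800
scores = []
for _ in range(repeats):
    folds = kfold_indices(n_samples, k, rng)
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        scores.append(evaluate(train_idx, test_idx))

print(len(scores))                               # 30 runs in total
print(f"{np.mean(scores):.4f} ± {np.std(scores):.4f}")
```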
COMPARISON EXPERIMENT
DL/ML | Model           | Hidden Layers Activation | With/without Dropout | Sample Size | Accuracy
------|-----------------|--------------------------|----------------------|-------------|----------------
DL    | Re-DBN          | ReLU                     | without              | Small       | 0.4310 ± 0.0701
DL    | Re-DDBN         | ReLU                     | with                 | Small       | 0.4620 ± 0.0805
DL    | Traditional DBN | Sigmoid                  | without              | Small       | 0.4093 ± 0.0690
DL    | Sig-DDBN        | Sigmoid                  | with                 | Small       | 0.4227 ± 0.0498
DL    | Re-DBN          | ReLU                     | without              | Large       | 0.4575 ± 0.1632
DL    | Re-DDBN         | ReLU                     | with                 | Large       | 0.5244 ± 0.1407
DL    | Traditional DBN | Sigmoid                  | without              | Large       | 0.4070 ± 0.1305
DL    | Sig-DDBN        | Sigmoid                  | with                 | Large       | 0.4106 ± 0.1026
ML    | SVM (Linear)    | --                       | --                   | Small       | 0.4428 ± 0.1379
ML    | SVM (Linear)    | --                       | --                   | Large       | 0.5058 ± 0.0839
ML    | NB (Multinomial)| --                       | --                   | Small       | 0.4563 ± 0.0690
ML    | NB (Multinomial)| --                       | --                   | Large       | 0.4868 ± 0.0839

Comparison of the developed Re-DDBN, traditional DBN, Re-DBN, Sig-DDBN, SVM, and NB for small
and large samples.
RESULTS & DISCUSSION
Re-DDBN obtained the best accuracy among the comparative models for both small
and large samples.
The second-highest accuracy was obtained by linear SVM for large samples and by
multinomial NB for small samples.
The sigmoid-based models showed weaker accuracy than the other models due to the
vanishing gradient problem.
Given the imbalanced, noisy dataset and the low characteristic dimensions of the samples,
the adjusted model (Re-DDBN) performs better than the other comparative models.
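The vanishing-gradient point can be illustrated numerically: the sigmoid's derivative never exceeds 0.25 and shrinks toward 0 for large-magnitude inputs, so gradients decay as they are multiplied layer by layer, while the ReLU derivative stays at 1 for any positive pre-activation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-6.0, 0.0, 6.0])
sig_grad = sigmoid(z) * (1.0 - sigmoid(z))  # derivative of the sigmoid
relu_grad = (z > 0).astype(float)           # derivative of ReLU (0 for z <= 0)

print(sig_grad)    # small everywhere; at most 0.25 (at z = 0)
print(relu_grad)   # exactly 1 for any positive pre-activation
print(0.25 ** 3)   # best-case sigmoid gradient factor over three layers: 0.015625
```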
CONCLUSION
A Re-DDBN model, which uses a dropout strategy and the ReLU activation function, is proposed in this
study to establish the learning strategy for relations discovery.
The adjusted model, Re-DDBN, outperformed the traditional DBN model and the other comparative
models.
The upcoming paper will present the development of relations extraction and classification
techniques for OL based on the Re-DDBN model.
In future work, we will compare ReLU with and without batch normalization, each with and without
the dropout strategy.
REFERENCES
1. Franco, W., Avila, C.V.S., Oliveira, A., Maia, G., Brayner, A., Vidal, V.M.P., et al.: Ontology-based Question Answering Systems over Knowledge Bases: A Survey. 22nd International Conference on
Enterprise Information Systems (ICEIS) p. 532-9: SCITEPRESS Digital Library; (2020).
2. Al-Aswadi, F.N., Chan, H.Y., Gan, K.H.:Extracting Semantic Concepts and Relations from Scientific Publications by Using Deep Learning. In: Saeed F, Mohammed F, Al-Nahari A, editors. Innovative
Systems for Intelligent Health Informatics. IRICT2020 (p. 374-83). Cham: Springer International Publishing; (2021).
3. Ahmed, I.A., Al-Aswadi, F.N., Noaman, K.M.G., Alma'aitah, W.Z.a.: Arabic Knowledge Graph Construction: A close look in the present and into the future. Journal of King Saud University -
Computer and Information Sciences. (2022).
4. Tiwari, S., Al-Aswadi, F.N., Gaurav, D.: Recent trends in knowledge graphs: theory and practice. Soft Computing. (2021).
5. Wong, W., Liu, W., Bennamoun, M.: Ontology learning from text: A look back and into the future. ACM Computing Surveys (CSUR).44(4):20. (2012).
6. Al-Aswadi, F.N., Chan, H.Y., Gan, K.H.: Automatic ontology construction from text: a review from shallow to deep learning trend. Artificial Intelligence Review.53(6):3901-28. (2020).
7. Zhao, Z., Han, S.-K., So, I.-M.: Architecture of knowledge graph construction techniques. International Journal of Pure and Applied Mathematics.118(19):1869-83. (2018).
8. Alma’aitah, W.Z.a., Talib, A.Z., Osman, M.A.: Opportunities and challenges in enhancing access to metadata of cultural heritage collections: a survey. Artificial Intelligence Review.53(5):3621-46.
(2020).
9. Saber, Y.M., Abdel-Galil, H., El-Fatah Belal, M.A.: Arabic ontology extraction model from unstructured text. Journal of King Saud University -Computer and Information Sciences. (2022)
10. Sathiya, B., Geetha, T.V.: Automatic Ontology Learning from Multiple Knowledge Sources of Text. Int J Intell Inf Technol.14(2):1–21. (2018).
11. Arefyev, N., Sheludko, B., Davletov, A., Kharchev, D., Nevidomsky, A., Panchenko, A.: Neural GRANNy at SemEval-2019 task 2: A combined approach for better modeling of semantic relationships in
semantic frame induction. Proceedings of the 13th International Workshop on Semantic Evaluation. p. 31-8(2019).
12. Gillani Andleeb, S.: From text mining to knowledge mining: An integrated framework of concept extraction and categorization for domain ontology. Department of Information Systems. p. 146.
Budapest: Budapesti Corvinus Egyetem; (2015).
13. Sombatsrisomboon, R., Matsuo, Y., Ishizuka, M.: Acquisition of hypernyms and hyponyms from the WWW. Proceedings of the 2nd International Workshop on Active Mining(2003).
14. Specia, L., Motta, E.: A hybrid approach for relation extraction aimed at the semantic web. International Conference on Flexible Query Answering Systems. p. 564-76: Springer; (2006).
15. Sánchez, D., Moreno, A.: Learning non-taxonomic relationships from web documents for domain ontology construction. Data & Knowledge Engineering.64(3):600-23. (2008).
REFERENCES
16. El-Kilany, A., Tazi, N.E., Ezzat, E.:Building Relation Extraction Templates via Unsupervised Learning.In:Proceedings of the 21st International Database Engineering & Applications Symposium.228-34.
Bristol, United Kingdom ACM, (2017).
17. Minard, A.-L., Ligozat, A.-L., Grau, B.: Multi-class SVM for relation extraction from clinical reports. Recent Advances in Natural Language Processing. Hissar, Bulgaria(2011).
18. Bergelid, L.: Classification of explicit music content using lyrics and music metadata. TRITA-EECS-EX. STOCKHOLM, SWEDEN: KTH ROYAL INSTITUTE OF TECHNOLOGY; (2018).
19. Sureshkumar, G., Zayaraz, G.: Automatic relation extraction using naïve Bayes classifier for concept relational ontology development. International Journal of Computer Aided Engineering and
Technology.7(4):421-35. (2015).
20. Zhang, J., Liu, J., Wang, X.: Simultaneous Entities and Relationship Extraction from Unstructured Text. International Journal of Database Theory and Application.9(6):151-60. (2016).
21. Etzioni, O., Banko, M., Soderland, S., Weld, D.S.: Open information extraction from the web. Commun ACM.51(12):68-74. (2008).
22. Zhong, B., Liu, J., Du, Y., Liaozheng, Y., Pu, J.: Extracting Attributes of Named Entity from Unstructured Text with Deep Belief Network. International Journal of Database Theory and
Application.9(5):187-96. (2016).
23. Chen, Y., Li, W., Liu, Y., Zheng, D., Zhao, T.: Exploring deep belief network for chinese relation extraction. Proceedings of the Joint Conference on Chinese Language Processing (CLP’10). p. 28-
9(2010).
24. Feng, J., Lu, S.: Performance analysis of various activation functions in artificial neural networks. Journal of physics: conference series. p. 022030: IOP Publishing; (2019).
25. Dai, J., Song, H., Sheng, G., Jiang, X.: Dissolved gas analysis of insulating oil for power transformer fault diagnosis with deep belief network. IEEE Transactions on Dielectrics and Electrical
Insulation.24(5):2828-35. (2017).
26. Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., Salakhutdinov, R.: Dropout: a simple way to prevent neural networks from overfitting. The journal of machine learning research.15(1):1929-58.
(2014).
27. Nematzadeh, Z., Ibrahim, R., Selamat, A.: Comparative studies on breast cancer classifications with k-fold cross validations using machine learning techniques. 2015 10th Asian Control Conference
(ASCC). p. 1-6(2015).
28. Wang, H.: Semantic Deep Learning. University of Oregon. (2015).
29. Huang, J., Guan, Y.: Dropout Deep Belief Network Based Chinese Ancient Ceramic Non-Destructive Identification. Sensors.21(4):1318. (2021).
Fatima N. AL-Aswadi | Huah Yong Chan | Keng Hoon Gan
Ph.D. candidate in the School of Computer Sciences at Universiti Sains Malaysia (USM); also a research assistant and instructor in the Computer Sciences Department at Hodeidah University, Yemen.

Associate Professor in the School of Computer Sciences, Universiti Sains Malaysia (USM).

Senior Lecturer in the School of Computer Sciences, Universiti Sains Malaysia (USM).
THANK YOU