Figure - uploaded by Kuldeep Singh
Comparison on AIDA-B. Best value in bold and previous SOTA value is underlined.

Source publication (preprint)
In this paper, we propose CHOLAN, a modular approach to target end-to-end entity linking (EL) over knowledge bases. CHOLAN consists of a pipeline of two transformer-based models integrated sequentially to accomplish the EL task. The first transformer model identifies surface forms (entity mentions) in a given text. For each mention, a second transf...
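The abstract describes a sequential two-stage pipeline: a mention-detection (MD) model finds surface forms, and an entity-disambiguation (ED) model resolves each one against a knowledge base. A minimal sketch of that composition is below; this is not the authors' code, and both stages are stubbed with toy rules (in CHOLAN each stage is a transformer model), with `KB`, `detect_mentions`, and `disambiguate` as illustrative names only.

```python
from typing import List, Tuple

# Toy lexicon standing in for a knowledge base; entries are illustrative.
KB = {"Berlin": "Q64", "Germany": "Q183"}

def detect_mentions(text: str) -> List[str]:
    """Stage 1 (MD) stub: flag capitalized tokens as candidate mentions."""
    return [tok.strip(".,") for tok in text.split() if tok[0].isupper()]

def disambiguate(mention: str, context: str) -> Tuple[str, str]:
    """Stage 2 (ED) stub: resolve a mention against the toy KB.
    The real ED model also consumes the local sentential context;
    the ablation reported in Table 4 shows dropping it hurts accuracy."""
    return (mention, KB.get(mention, "NIL"))

def link_entities(text: str) -> List[Tuple[str, str]]:
    # Sequential composition: ED consumes MD's mentions plus the sentence.
    return [disambiguate(m, text) for m in detect_mentions(text)]

print(link_entities("Berlin is the capital of Germany."))
# → [('Berlin', 'Q64'), ('Germany', 'Q183')]
```

The design point the sketch preserves is modularity: either stage can be swapped or retrained independently, which is what the configurations compared on AIDA-B vary.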

Contexts in source publication

Context 1
... observe that the performance drops when the local sentential context is not fed. Table 4 reports the performance of CHOLAN's configurations on the AIDA-B test set. The first configuration is "CHOLAN-Wiki + FC", in which the MD and ED models are trained using CoNLL-AIDA. ...