Adv. Artif. Intell. Mach. Learn., 3 (2):1122-1134
Wenping Wang: Individual Researcher
Article History: Received on: 10-May-23, Accepted on: 13-Jun-23, Published on: 18-Jun-23
Corresponding Author: Wenping Wang
Citation: Xuanyue Yang, Wenting Ye, Tianwei Yue, Wenping Wang (2023). Linguistically-Inspired Neural Coreference Resolution. Adv. Artif. Intell. Mach. Learn., 3 (2):1122-1134
The field of coreference resolution has witnessed significant advancements since the introduction of deep learning-based models. In this paper, we replicate the state-of-the-art coreference resolution model and perform a thorough error analysis. We identify a potential limitation of the current approach in its treatment of grammatical constructions within sentences. Furthermore, the model struggles to leverage contextual information across sentences, resulting in suboptimal accuracy when resolving mentions that span multiple sentences. Motivated by these observations, we propose an approach that integrates linguistic information throughout the entire architecture. Our contributions include multitask learning with part-of-speech (POS) tagging, supervision of intermediate scores, and self-attention mechanisms that operate across sentences. By incorporating these linguistically-inspired modules, we not only achieve a modest improvement in the F1 score on the CoNLL-2012 dataset, but we also perform qualitative analysis to determine whether our model surpasses the baseline in ways not captured by the evaluation metrics. Our findings demonstrate that our model successfully learns linguistic signals that are absent from the original baseline. We posit that these improvements may have gone undetected due to annotation errors, but they nonetheless lead to a more accurate understanding of coreference resolution.
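To make two of the contributions above concrete, the following is a minimal numpy sketch, not the paper's actual architecture: document-level self-attention that lets a token attend across sentence boundaries, and a multitask objective that adds an auxiliary POS-tagging loss to the coreference loss. All function names and the `pos_weight` hyperparameter are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_sentence_self_attention(embeddings):
    # Scaled dot-product self-attention over ALL tokens in a document
    # (shape: [num_tokens, dim]), so a mention can attend to candidate
    # antecedents in earlier sentences, not just its own sentence.
    d_k = embeddings.shape[-1]
    scores = embeddings @ embeddings.T / np.sqrt(d_k)  # token-pair scores
    weights = softmax(scores, axis=-1)                 # one distribution per token
    return weights @ embeddings                        # context-mixed representations

def multitask_loss(coref_loss, pos_logits, pos_labels, pos_weight=0.5):
    # Toy multitask objective: coreference loss plus an auxiliary
    # POS-tagging cross-entropy, weighted by the illustrative
    # hyperparameter `pos_weight`.
    probs = softmax(pos_logits, axis=-1)
    pos_loss = -np.log(probs[np.arange(len(pos_labels)), pos_labels]).mean()
    return coref_loss + pos_weight * pos_loss
```

In a real model the attention weights and the shared encoder would be trained jointly, so the POS supervision shapes the same representations the coreference scorer consumes; this sketch only shows the forward computation.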