Abstract
Document-level relation extraction (DocRE) is the task of identifying relations between entities within a document. Existing DocRE models, however, often overlook correlations between relations and lack a quantitative analysis of such correlations. To address this limitation and effectively capture relation correlations in DocRE, we propose a relation graph method that explicitly exploits the interdependency among relations. First, we construct a relation graph that models relation correlations using statistical co-occurrence information derived from prior relation knowledge. Second, we apply a re-weighting scheme to obtain an effective relation correlation matrix that guides the propagation of relation information. We then leverage graph attention networks to aggregate relation embeddings. Importantly, our method can be seamlessly integrated as a plug-and-play module into existing models. Experimental results demonstrate that our approach improves multi-relation extraction, highlighting the effectiveness of considering relation correlations in DocRE.
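To make the pipeline concrete, the sketch below illustrates one plausible realization of the idea described above: a relation correlation matrix estimated from label co-occurrence statistics, re-weighted to balance self-connections against neighbour edges, and a single graph-attention head that propagates relation embeddings along the resulting graph. This is a minimal PyTorch illustration, not the authors' implementation; all names and hyperparameters (num_rel, tau, p, RelationGAT) are assumptions introduced for the example.

```python
# Minimal sketch (not the authors' code): a co-occurrence-based relation
# correlation matrix with re-weighting, followed by GAT-style aggregation
# of relation embeddings. Hyperparameters tau and p are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

def build_correlation_matrix(label_sets, num_rel, tau=0.1, p=0.3):
    """label_sets: list of sets of relation ids that co-occur for one entity pair."""
    counts = torch.zeros(num_rel)             # occurrences of relation i
    cooc = torch.zeros(num_rel, num_rel)      # co-occurrences of relations i and j
    for labels in label_sets:
        for i in labels:
            counts[i] += 1
            for j in labels:
                if i != j:
                    cooc[i, j] += 1
    cond = cooc / counts.clamp(min=1).unsqueeze(1)    # conditional probability P(j | i)
    adj = (cond >= tau).float()                       # binarize to suppress noisy edges
    # re-weighting: keep (1 - p) mass on self-loops, spread p over neighbours
    neigh = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)
    return p * neigh + (1 - p) * torch.eye(num_rel)

class RelationGAT(nn.Module):
    """One attention head that propagates relation embeddings along graph edges."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim, bias=False)
        self.attn = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, rel_emb, adj):
        h = self.proj(rel_emb)                                        # (R, d)
        R = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(R, R, -1),
                           h.unsqueeze(0).expand(R, R, -1)], dim=-1)  # (R, R, 2d)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))           # (R, R)
        scores = scores.masked_fill(adj == 0, float("-inf"))          # keep graph edges only
        alpha = torch.softmax(scores, dim=-1)
        return alpha @ h                                              # aggregated embeddings

# Example usage with toy data:
# corr = build_correlation_matrix([{0, 2}, {1}, {0, 1, 2}], num_rel=3)
# gat = RelationGAT(dim=16)
# out = gat(torch.randn(3, 16), corr)   # (3, 16) updated relation embeddings
```

Binarizing and re-weighting the conditional co-occurrence matrix is a common way to suppress noisy long-tail edges before message passing; the attention layer then learns data-dependent weights over the retained edges.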
Acknowledgements
The authors gratefully acknowledge the support of the National Natural Science Foundation of China (NSFC) under Grant No. 62106143 and the Shanghai Pujiang Program (No. 21PJ1405700).