
Document-Level Relation Extraction with Relation Correlation Enhancement

  • Conference paper
Neural Information Processing (ICONIP 2023)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1967)

Abstract

Document-level relation extraction (DocRE) is a task that focuses on identifying relations between entities within a document. However, existing DocRE models often overlook the correlation between relations and lack a quantitative analysis of relation correlations. To address this limitation and effectively capture relation correlations in DocRE, we propose a relation graph method, which aims to explicitly exploit the interdependency among relations. Firstly, we construct a relation graph that models relation correlations using statistical co-occurrence information derived from prior relation knowledge. Secondly, we employ a re-weighting scheme to create an effective relation correlation matrix to guide the propagation of relation information. Furthermore, we leverage graph attention networks to aggregate relation embeddings. Importantly, our method can be seamlessly integrated as a plug-and-play module into existing models. Experimental results demonstrate that our approach can enhance the performance of multi-relation extraction, highlighting the effectiveness of considering relation correlations in DocRE.
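
The abstract describes the method only at a high level: a relation graph built from statistical co-occurrence of relations, a re-weighted correlation matrix, and graph attention over relation embeddings. As a rough illustration of that pipeline (not the authors' implementation), the sketch below builds a conditional co-occurrence matrix from training label sets, prunes and re-weights it, and runs a single-head graph attention layer over learned relation embeddings. PyTorch, the helper names, and the hyperparameters tau and p are assumptions for illustration only.

# Minimal sketch (not the authors' code): build a relation co-occurrence graph
# from prior label statistics, re-weight it, and aggregate relation embeddings
# with a single-head graph attention layer. tau and p are illustrative values.
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_correlation_matrix(label_sets, num_relations, tau=0.1, p=0.3):
    """Conditional co-occurrence P(r_j | r_i), pruned at tau and re-weighted.

    label_sets: iterable of sets of relation ids that co-occur (e.g. per entity
    pair in the training split). tau and p are assumed hyperparameters.
    """
    count = torch.zeros(num_relations, num_relations)
    occur = torch.zeros(num_relations)
    for labels in label_sets:
        labels = list(labels)
        for i in labels:
            occur[i] += 1
            for j in labels:
                if i != j:
                    count[i, j] += 1
    cond = count / occur.clamp(min=1).unsqueeze(1)       # P(r_j | r_i)
    adj = (cond >= tau).float()                          # prune noisy edges
    # re-weighting: keep some self-information, spread the rest over neighbors
    deg = adj.sum(1, keepdim=True).clamp(min=1)
    adj = p * adj / deg + (1 - p) * torch.eye(num_relations)
    return adj


class RelationGAT(nn.Module):
    """Single-head graph attention over relation embeddings, masked by the
    correlation matrix so attention only flows along correlated relations."""

    def __init__(self, num_relations, dim):
        super().__init__()
        self.emb = nn.Embedding(num_relations, dim)
        self.W = nn.Linear(dim, dim, bias=False)
        self.a = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, adj):
        h = self.W(self.emb.weight)                      # (R, d)
        R = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(R, R, -1),
                           h.unsqueeze(0).expand(R, R, -1)], dim=-1)
        scores = F.leaky_relu(self.a(pairs).squeeze(-1)) # (R, R)
        scores = scores.masked_fill(adj <= 0, float('-inf'))
        att = torch.softmax(scores, dim=-1)
        return att @ h                                   # correlation-aware relation embeddings


# Toy usage with 5 hypothetical relation types.
adj = build_correlation_matrix([{0, 1}, {1, 2}, {0, 1, 3}], num_relations=5)
rel_emb = RelationGAT(num_relations=5, dim=64)(adj)

In use, the correlation-aware relation embeddings produced this way would be fused with the entity-pair representations of an existing DocRE model, which is the plug-and-play integration the abstract refers to; the exact fusion used in the paper is not specified here.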


Notes

  1. We do not conduct experiments on the CDR [13] and GDA [25] datasets from the biomedical domain, because they do not suffer from the multi-relation issue and therefore do not match our scenario.

References

  1. Baruch, E.B., et al.: Asymmetric loss for multi-label classification. CoRR (2020)
  2. Cai, R., Zhang, X., Wang, H.: Bidirectional recurrent convolutional neural network for relation classification. In: Proceedings of ACL (2016)
  3. Che, X., Chen, D., Mi, J.: Label correlation in multi-label classification using local attribute reductions with fuzzy rough sets. In: FSS (2022)
  4. Chen, M., Lan, G., Du, F., Lobanov, V.S.: Joint learning with pre-trained transformer on named entity recognition and relation extraction tasks for clinical analytics. In: ClinicalNLP@EMNLP 2020, Online, November 19, 2020 (2020)
  5. Chen, Z., Wei, X., Wang, P., Guo, Y.: Multi-label image recognition with graph convolutional networks. In: CVPR (2019)
  6. Devlin, J., Chang, M., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL-HLT (2019)
  7. Feng, J., Huang, M., Zhao, L., Yang, Y., Zhu, X.: Reinforcement learning for relation classification from noisy data. In: Proceedings of AAAI (2018)
  8. Guo, Z., Zhang, Y., Lu, W.: Attention guided graph convolutional networks for relation extraction. In: Proceedings of ACL (2019)
  9. He, H., Balakrishnan, A., Eric, M., Liang, P.: Learning symmetric collaborative dialogue agents with dynamic knowledge graph embeddings. In: Proceedings of ACL (2017)
  10. Hendrickx, I., et al.: SemEval-2010 task 8. In: SEW@NAACL-HLT 2009, Boulder, CO, USA, June 4, 2009 (2009)
  11. Hixon, B., Clark, P., Hajishirzi, H.: Learning knowledge graphs for question answering through conversational dialog. In: ACL (2015)
  12. Li, B., Ye, W., Sheng, Z., Xie, R., Xi, X., Zhang, S.: Graph enhanced dual attention network for document-level relation extraction. In: Proceedings of COLING (2020)
  13. Li, J., et al.: BioCreative V CDR task corpus: a resource for chemical disease relation extraction. Database J. Biol. Databases Curation 2016 (2016)
  14. Li, J., Xu, K., Li, F., Fei, H., Ren, Y., Ji, D.: MRN: a locally and globally mention-based reasoning network for document-level relation extraction. In: Proceedings of ACL (2021)
  15. Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. CoRR (2019)
  16. Nan, G., Guo, Z., Sekulic, I., Lu, W.: Reasoning with latent structure refinement for document-level relation extraction. In: ACL (2020)
  17. Peng, N., Poon, H., Quirk, C., Toutanova, K., Yih, W.: Cross-sentence N-ary relation extraction with graph LSTMs. TACL (2017)
  18. Pennington, J., Socher, R., Manning, C.D.: GloVe: global vectors for word representation. In: EMNLP, pp. 1532–1543. ACL (2014)
  19. dos Santos, C.N., Xiang, B., Zhou, B.: Classifying relations by ranking with convolutional neural networks. In: ACL (2015)
  20. Tang, H., et al.: HIN: hierarchical inference network for document-level relation extraction. In: Proceedings of KDD (2020)
  21. Velickovic, P., Cucurull, G., Casanova, A., Romero, A., Liò, P., Bengio, Y.: Graph attention networks. In: ICLR (2018)
  22. Wang, H., Focke, C., Sylvester, R., Mishra, N., Wang, W.Y.: Fine-tune BERT for DocRED with two-step process. CoRR (2019)
  23. Wang, L., Cao, Z., de Melo, G., Liu, Z.: Relation classification via multi-level attention CNNs. In: Proceedings of ACL (2016)
  24. Wang, Y., et al.: Multi-label classification with label graph superimposing. In: Proceedings of AAAI (2020)
  25. Wu, Y., Luo, R., Leung, H.C.M., Ting, H., Lam, T.W.: RENET: a deep learning approach for extracting gene-disease associations from literature. In: RECOMB 2019, Washington, DC, USA, May 5–8, 2019, Proceedings (2019)
  26. Xiao, Y., Tan, C., Fan, Z., Xu, Q., Zhu, W.: Joint entity and relation extraction with a hybrid transformer and reinforcement learning based model. In: Proceedings of AAAI (2020)
  27. Yao, Y., et al.: DocRED: a large-scale document-level relation extraction dataset. In: ACL (2019)
  28. Ye, D., et al.: Coreferential reasoning learning for language representation. In: Proceedings of EMNLP (2020)
  29. Ye, Z., Ling, Z.: Distant supervision relation extraction with intra-bag and inter-bag attentions. In: Proceedings of ACL (2019)
  30. Zeng, D., Liu, K., Lai, S., Zhou, G., Zhao, J.: Relation classification via convolutional deep neural network. In: Proceedings of COLING (2014)
  31. Zeng, S., Xu, R., Chang, B., Li, L.: Double graph based reasoning for document-level relation extraction. In: EMNLP (2020)
  32. Zhang, N., et al.: Document-level relation extraction as semantic segmentation. In: Proceedings of IJCAI (2021)
  33. Zhang, Y., Zhong, V., Chen, D., Angeli, G., Manning, C.D.: Position-aware attention and supervised data improve slot filling. In: Proceedings of EMNLP (2017)
  34. Zhang, Z., et al.: Document-level relation extraction with dual-tier heterogeneous graph. In: Proceedings of COLING (2020)
  35. Zhou, H., Xu, Y., Yao, W., Liu, Z., Lang, C., Jiang, H.: Global context-enhanced graph convolutional networks for document-level relation extraction. In: COLING (2020)
  36. Zhou, J., et al.: Graph neural networks: a review of methods and applications. AI Open (2020)
  37. Zhou, P., et al.: Attention-based bidirectional long short-term memory networks for relation classification. In: Proceedings of ACL (2016)
  38. Zhou, W., Huang, K., Ma, T., Huang, J.: Document-level relation extraction with adaptive thresholding and localized context pooling. In: Proceedings of AAAI (2021)
  39. Zhu, H., Lin, Y., Liu, Z., Fu, J., Chua, T., Sun, M.: Graph neural networks with generated parameters for relation extraction. In: Proceedings of ACL (2019)

Acknowledgements

The authors gratefully acknowledge the support of the National Natural Science Foundation of China (NSFC) grant (No. 62106143) and the Shanghai Pujiang Program (No. 21PJ1405700).

Author information

Corresponding author

Correspondence to Zhouhan Lin.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Huang, Y., Lin, Z. (2024). Document-Level Relation Extraction with Relation Correlation Enhancement. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Communications in Computer and Information Science, vol 1967. Springer, Singapore. https://doi.org/10.1007/978-981-99-8178-6_33

  • DOI: https://doi.org/10.1007/978-981-99-8178-6_33

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8177-9

  • Online ISBN: 978-981-99-8178-6

  • eBook Packages: Computer Science, Computer Science (R0)
