RESC: Relation Extraction by Sequence Compression
Abstract
Extracting entities and their interrelationships is a fundamental technique for representing knowledge compactly in graph form. This paper introduces RESC, an architecture for extracting relational triples through sequence transformation and compression. The model is built on a BERT-like encoder with two specialized decision heads; a novel sequence compression algorithm reduces each head's task to a simplified classification problem. RESC is non-autoregressive and is not susceptible to problems caused by overlapping relations, a limitation common to many prior approaches. As a result, the method achieves strong performance, including an F1-score above 0.88 on NYT11, a standard benchmark for relational triple extraction.
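The overall shape of the architecture described above can be illustrated with a minimal sketch: a shared encoder feeding two independent classification heads, with the hidden sequence "compressed" before the relation decision. This is not the paper's implementation; all dimensions, head definitions, and the masked-mean pooling used as a stand-in for the compression algorithm are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class RESCSketch(nn.Module):
    """Hypothetical sketch of the RESC layout: a BERT-like encoder
    with two decision heads. A small nn.TransformerEncoder stands in
    for the pretrained encoder; masked mean pooling stands in for the
    paper's sequence compression algorithm (details not reproduced)."""

    def __init__(self, vocab_size=1000, d_model=64,
                 n_entity_tags=5, n_relations=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Head 1: per-token entity decisions (a simple tagging problem).
        self.entity_head = nn.Linear(d_model, n_entity_tags)
        # Head 2: relation decisions over the compressed representation.
        self.relation_head = nn.Linear(d_model, n_relations)

    def forward(self, token_ids, keep_mask):
        h = self.encoder(self.embed(token_ids))        # (B, T, d_model)
        entity_logits = self.entity_head(h)            # (B, T, n_entity_tags)
        # "Compression" stand-in: average only the positions flagged
        # by keep_mask, discarding the rest of the sequence.
        m = keep_mask.unsqueeze(-1).float()
        pooled = (h * m).sum(dim=1) / m.sum(dim=1).clamp(min=1.0)
        relation_logits = self.relation_head(pooled)   # (B, n_relations)
        return entity_logits, relation_logits

model = RESCSketch()
token_ids = torch.randint(0, 1000, (2, 8))             # batch of 2 sentences
keep_mask = torch.ones(2, 8, dtype=torch.bool)
ent, rel = model(token_ids, keep_mask)
print(ent.shape, rel.shape)
```

Because both heads decide in a single forward pass over the encoded sequence, no autoregressive decoding is needed, which is consistent with the non-autoregressive property claimed in the abstract.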
ISSN: 2307-8162