利用監督式對比學習來建構增強型的自迴歸文件檢索器

Translated title of the contribution: Building an Enhanced Autoregressive Document Retriever Leveraging Supervised Contrastive Learning

Yi Cheng Wang, Tzu Ting Yang, Hsin Wei Wang, Yung Chang Hsu, Berlin Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

The goal of an information retrieval system is to retrieve the documents most relevant to a given user query from a huge document collection, which usually requires time-consuming pairwise comparisons between the query and the candidate documents to find the most relevant ones. Recently, a novel retrieval modeling approach, dubbed the Differentiable Search Index (DSI), has been proposed. DSI dramatically simplifies the whole retrieval process by encoding all information about the document collection into the parameter space of a single Transformer model, which can then generate the identifiers (IDs) of relevant documents in an autoregressive manner in response to a user query. Although DSI addresses the shortcomings of traditional retrieval systems, previous studies have pointed out that DSI may fail to retrieve relevant documents: DSI uses document IDs as the pivotal mechanism for establishing the relationship between queries and documents, yet not every document in the collection has corresponding relevant and irrelevant queries available for training. In view of this, we propose leveraging supervised contrastive learning to better render the relationship between queries and documents in the latent semantic space. Furthermore, an approximate nearest neighbor search strategy is employed at retrieval time to help the Transformer model generate document IDs relevant to a posed query more efficiently. A series of experiments conducted on the Natural Questions benchmark dataset confirms the effectiveness and practical feasibility of our approach in relation to several strong baseline systems.
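To illustrate the general idea behind the supervised contrastive learning objective mentioned above, the following is a minimal sketch of an in-batch supervised contrastive loss over query and document embeddings. All names (`supervised_contrastive_loss`, `labels`, `temperature`) are illustrative assumptions for this sketch; the paper's exact formulation, negative-sampling scheme, and hyperparameters may differ.

```python
import numpy as np

def supervised_contrastive_loss(query_embs, doc_embs, labels, temperature=0.1):
    """Sketch of an in-batch supervised contrastive loss.

    `labels[i]` is the index of the relevant document for query i; the
    remaining documents in the batch serve as negatives. Hypothetical
    formulation, not the paper's exact loss.
    """
    # Normalize embeddings so dot products are cosine similarities.
    q = query_embs / np.linalg.norm(query_embs, axis=1, keepdims=True)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    logits = (q @ d.T) / temperature            # (num_queries, num_docs)
    # Softmax cross-entropy: pull each query toward its relevant document
    # and push it away from the in-batch negative documents.
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Minimizing this loss draws each query's embedding toward its relevant document in the latent semantic space, which is also what makes a nearest-neighbor lookup over those embeddings a sensible retrieval-time aid.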

Original language: Chinese (Traditional)
Title of host publication: ROCLING 2022 - Proceedings of the 34th Conference on Computational Linguistics and Speech Processing
Editors: Yung-Chun Chang, Yi-Chin Huang, Jheng-Long Wu, Ming-Hsiang Su, Hen-Hsen Huang, Yi-Fen Liu, Lung-Hao Lee, Chin-Hung Chou, Yuan-Fu Liao
Publisher: The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages: 273-282
Number of pages: 10
ISBN (Electronic): 9789869576956
Publication status: Published - 2022
Event: 34th Conference on Computational Linguistics and Speech Processing, ROCLING 2022 - Taipei, Taiwan
Duration: 2022 Nov 21 – 2022 Nov 22

Publication series

Name: ROCLING 2022 - Proceedings of the 34th Conference on Computational Linguistics and Speech Processing

Conference

Conference: 34th Conference on Computational Linguistics and Speech Processing, ROCLING 2022
Country/Territory: Taiwan
City: Taipei
Period: 2022/11/21 – 2022/11/22

ASJC Scopus subject areas

  • Language and Linguistics
  • Speech and Hearing
