Improved speech summarization with multiple-hypothesis representations and Kullback-Leibler divergence measures

Shih Hsiang Lin*, Berlin Chen

*Corresponding author of this work

Research output: Contribution to journal › Conference article › Peer-reviewed

23 Citations (Scopus)

Abstract

Imperfect speech recognition often leads to degraded performance when leveraging existing text-based methods for speech summarization. To alleviate this problem, this paper investigates various ways to robustly represent the recognition hypotheses of spoken documents beyond the top scoring ones. Moreover, a new summarization method stemming from the Kullback-Leibler (KL) divergence measure and exploring both the sentence and document relevance information is proposed to work with such robust representations. Experiments on broadcast news speech summarization seem to demonstrate the utility of the presented approaches.
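As a rough illustration of the KL-divergence ranking idea only (not the exact formulation in the paper, which additionally exploits document relevance information and multiple recognition hypotheses rather than a single transcript), the sketch below scores each candidate sentence by the KL divergence between a smoothed unigram model of the whole spoken document and a smoothed unigram model of that sentence, ranking sentences with smaller divergence first. The function names, toy data, and additive-smoothing constant are illustrative assumptions, not part of the published method.

```python
import math
from collections import Counter

def unigram_model(tokens, vocab, smoothing=1e-3):
    """Additively smoothed unigram distribution over a fixed vocabulary."""
    counts = Counter(tokens)
    total = len(tokens) + smoothing * len(vocab)
    return {w: (counts[w] + smoothing) / total for w in vocab}

def kl_divergence(p, q):
    """KL(p || q); both distributions are defined over the same smoothed vocabulary."""
    return sum(p[w] * math.log(p[w] / q[w]) for w in p)

def rank_sentences_by_kl(sentences):
    """Rank sentences by how closely their unigram model matches the
    whole-document model; smaller KL divergence ranks higher."""
    doc_tokens = [w for s in sentences for w in s.split()]
    vocab = set(doc_tokens)
    doc_model = unigram_model(doc_tokens, vocab)
    scored = [(kl_divergence(doc_model, unigram_model(s.split(), vocab)), i, s)
              for i, s in enumerate(sentences)]
    return [s for _, _, s in sorted(scored)]

# Toy usage: keep the top-ranked sentences as the extractive summary.
doc = ["the central bank raised interest rates today",
       "analysts expected the move for several weeks",
       "the bank also revised its growth forecast"]
print(rank_sentences_by_kl(doc)[:2])
```

In this sketch the document acts as the reference distribution, so sentences whose word usage best covers the document's overall content float to the top; the paper's contribution lies in making such measures robust to recognition errors via richer hypothesis representations.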

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Signal Processing
  • Software
  • Sensory Systems
