Concepedia

TLDR

End‑to‑end neural dialogue models often generate uninformative responses, and while prior work has added external knowledge, few have addressed how to select the appropriate knowledge, which can impede learning. We propose an end‑to‑end neural model that employs a novel knowledge‑selection mechanism using both prior and posterior distributions to guide knowledge choice. The model infers a posterior over knowledge from utterances and responses to ensure correct selection during training, and a prior from utterances alone to approximate the posterior for inference without responses. Experiments with automatic and human evaluation show that our model better incorporates appropriate knowledge and outperforms previous baselines.

Abstract

End-to-end neural models for intelligent dialogue systems suffer from the problem of generating uninformative responses. Various methods have been proposed to generate more informative responses by leveraging external knowledge. However, little previous work has focused on selecting appropriate knowledge during learning; inappropriate knowledge selection can prevent the model from learning to make full use of the knowledge. Motivated by this, we propose an end-to-end neural model with a novel knowledge selection mechanism in which both prior and posterior distributions over knowledge are used to facilitate selection. Specifically, a posterior distribution over knowledge is inferred from both utterances and responses, ensuring the appropriate selection of knowledge during training. Meanwhile, a prior distribution, inferred from utterances only, is trained to approximate the posterior distribution so that appropriate knowledge can be selected even without responses at inference time. Compared with previous work, our model better incorporates appropriate knowledge into response generation. Both automatic and human evaluations verify the superiority of our model over previous baselines.
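The core idea above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (not the authors' implementation): toy relevance scores stand in for learned encoder outputs, a posterior over knowledge candidates is computed from the utterance plus the gold response, a prior is computed from the utterance alone, and a KL term pulls the prior toward the posterior so that prior-based selection works at inference time, when no response is available.

```python
import math

def softmax(scores):
    """Turn raw relevance scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def kl_divergence(p, q):
    """KL(p || q); used here to pull the prior toward the posterior."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical relevance scores for 3 knowledge candidates.
prior_scores = [0.2, 1.0, 0.1]      # scored from the utterance alone
posterior_scores = [0.1, 2.5, 0.0]  # scored from utterance + gold response

prior = softmax(prior_scores)
posterior = softmax(posterior_scores)

# Training: knowledge is selected via the posterior, and minimizing
# KL(posterior || prior) teaches the prior to mimic the posterior.
kl_loss = kl_divergence(posterior, prior)

# Inference: no response exists yet, so knowledge is selected via the prior.
selected = max(range(len(prior)), key=lambda i: prior[i])
```

In the actual model the KL term would be one part of a joint training objective alongside the generation loss; this sketch only shows why the two distributions and the KL term make response-free selection possible.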

