Publication | Open Access
Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation
191 Citations | 21 References | Year: 2016
Using neural networks to generate replies in human-computer dialogue systems has attracted increasing attention over the past few years. However, the performance is not satisfactory: the neural network tends to generate safe, universally relevant replies which carry little meaning. In this paper, we propose a content-introducing approach to neural network-based generative dialogue systems. We first use pointwise mutual information (PMI) to predict a noun as a keyword, reflecting the main gist of the reply. We then propose seq2BF, a "sequence to backward and forward sequences" model, which generates a reply containing the given keyword. Experimental results show that our approach significantly outperforms traditional sequence-to-sequence models in terms of human evaluation and the entropy measure, and that the predicted keyword can appear at an appropriate position in the reply.
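The first step, PMI-based keyword prediction, can be illustrated with a minimal sketch. The toy corpus, the candidate-noun list, and the choice of summing PMI scores over query words are assumptions for illustration, not the paper's exact training setup:

```python
import math
from collections import Counter

# Toy query-reply corpus; each item is (query tokens, reply tokens).
corpus = [
    (["do", "you", "like", "coffee"], ["i", "drink", "coffee", "every", "morning"]),
    (["do", "you", "like", "tea"],    ["green", "tea", "is", "my", "favorite"]),
    (["nice", "weather", "today"],    ["yes", "the", "sun", "is", "out"]),
]

# Co-occurrence and marginal counts over (query word, reply word) pairs.
pair_counts, query_counts, reply_counts = Counter(), Counter(), Counter()
total_pairs = 0
for query, reply in corpus:
    for q in query:
        for r in reply:
            pair_counts[(q, r)] += 1
            query_counts[q] += 1
            reply_counts[r] += 1
            total_pairs += 1

def pmi(q, r):
    """Pointwise mutual information between a query word and a reply word."""
    joint = pair_counts[(q, r)] / total_pairs
    p_q = query_counts[q] / total_pairs
    p_r = reply_counts[r] / total_pairs
    return math.log(joint / (p_q * p_r))

def predict_keyword(query, candidate_nouns):
    """Pick the candidate noun with the highest total PMI against the query words,
    skipping pairs that never co-occur (their PMI would be -inf)."""
    return max(candidate_nouns,
               key=lambda n: sum(pmi(q, n) for q in query
                                 if pair_counts[(q, n)] > 0))

print(predict_keyword(["do", "you", "like", "coffee"], ["coffee", "tea", "sun"]))
# → coffee
```

On this toy data, "coffee" wins because it co-occurs with all four query words, including itself, while "tea" shares only the generic words "do", "you", and "like".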
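The second step, seq2BF, decodes a backward sequence starting from the keyword and then a forward sequence to complete the reply. Leaving the neural decoders aside, the splicing of the two halves around the keyword can be sketched as follows; the function name and example tokens are hypothetical:

```python
def seq2bf_assemble(backward_tokens, keyword, forward_tokens):
    """Splice a seq2BF reply together.

    The backward decoder emits the words to the LEFT of the keyword,
    nearest-first, so they are reversed before the keyword and the
    forward half are appended.
    """
    return list(reversed(backward_tokens)) + [keyword] + forward_tokens

# Backward half: "drink", then "i" (moving away from the keyword).
# Forward half: "every", "morning".
print(seq2bf_assemble(["drink", "i"], "coffee", ["every", "morning"]))
# → ['i', 'drink', 'coffee', 'every', 'morning']
```

Because the keyword seeds both decoding directions, it can land anywhere in the reply rather than being forced to the start, which is what the abstract means by the keyword appearing "at an appropriate position".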