Concepedia

Publication | Open Access

Leveraging Sentence-level Information with Encoder LSTM for Semantic Slot Filling

Year: 2016 · Citations: 100 · References: 36

Abstract

Recurrent Neural Networks (RNNs) and one of their specific architectures, Long Short-Term Memory (LSTM), have been widely used for sequence labeling. Explicitly modeling output label dependencies on top of an RNN/LSTM is a widely studied and effective extension. We propose another extension that incorporates global information spanning the whole input sequence. The proposed method, the encoder-labeler LSTM, first encodes the whole input sequence into a fixed-length vector with an encoder LSTM, and then uses this encoded vector as the initial state of another LSTM for sequence labeling. With this method, we can predict the label sequence while taking the whole input sequence into consideration. In experiments on a slot filling task, an essential component of natural language understanding, using the standard ATIS corpus, we achieved a state-of-the-art F1-score of 95.66%.
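The abstract's two-stage architecture can be sketched directly: an encoder LSTM consumes the input sequence, and its final hidden/cell state becomes the initial state of a second, labeler LSTM that emits one label per position. The following is a minimal NumPy illustration of that data flow, not the authors' implementation; all dimensions, the gate layout, and the random untrained weights are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; gate order in the stacked weights: i, f, o, g."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])           # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:])       # candidate cell update
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def encoder_labeler(xs, enc, lab, W_out, b_out):
    """Encode the whole sequence, then label it starting from the encoded state."""
    H = enc["U"].shape[1]
    h, c = np.zeros(H), np.zeros(H)
    # Encoder LSTM: compress the full input into a fixed-length (h, c) state.
    for x in xs:
        h, c = lstm_step(x, h, c, enc["W"], enc["U"], enc["b"])
    # Labeler LSTM: its initial state is the encoder's final state, so every
    # label prediction is conditioned on global sentence-level information.
    logits = []
    for x in xs:
        h, c = lstm_step(x, h, c, lab["W"], lab["U"], lab["b"])
        logits.append(W_out @ h + b_out)
    return np.stack(logits)  # shape (T, num_labels)

# Toy dimensions (hypothetical, not from the paper): 6 words, 5-dim embeddings,
# 8 hidden units, 4 slot labels; weights are random and untrained.
rng = np.random.default_rng(0)
T, D, H, L = 6, 5, 8, 4
make = lambda: {"W": rng.standard_normal((4 * H, D)) * 0.1,
                "U": rng.standard_normal((4 * H, H)) * 0.1,
                "b": np.zeros(4 * H)}
enc, lab = make(), make()
W_out, b_out = rng.standard_normal((L, H)) * 0.1, np.zeros(L)
xs = rng.standard_normal((T, D))
out = encoder_labeler(xs, enc, lab, W_out, b_out)
print(out.shape)  # (6, 4)
```

In a trained model the per-step logits would be pushed through a softmax and the whole pipeline optimized with cross-entropy; the sketch only shows the forward data flow that distinguishes the encoder-labeler LSTM from a plain labeling LSTM.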
