Publication | Open Access
Sentence-State LSTM for Text Representation
Citations: 226
References: 44
Year: 2018
Venue: Unknown
Keywords: Engineering, Machine Learning, Multilingual Pretraining, Recurrent Neural Network, BiLSTM Models, Bi-directional LSTMs, Text Mining, Alternative LSTM Structure, Natural Language Processing, Speech Recognition, Data Science, Computational Linguistics, Language Studies, Machine Translation, Sequence Modelling, NLP Task, Neural Machine Translation, Sentence-State LSTM, Linguistics
Bi-directional LSTMs are a powerful tool for text representation. However, they have been shown to suffer from various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which maintains a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than to read a sequence of words incrementally. Results on various classification and sequence-labelling benchmarks show that the proposed model has strong representational power, achieving highly competitive performance compared with stacked BiLSTM models with similar numbers of parameters.
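
As a rough illustration of the idea in the abstract, here is a minimal PyTorch sketch of one S-LSTM-style layer: every word keeps its own state, and each recurrent step exchanges information with the left/right neighbours (local) and a sentence-level state (global) in parallel. This is an assumption-laden simplification, not the paper's exact formulation: the class name `SLSTMSketch`, the single shared gate projection, the number of recurrent steps, and the plain mean used for the global update are all illustrative choices; the paper uses a richer gate set and gated aggregation.

```python
# Minimal sketch of a Sentence-State LSTM (S-LSTM) style layer.
# Simplified gating for illustration; not the paper's exact equations.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SLSTMSketch(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, steps: int = 3):
        super().__init__()
        self.steps = steps
        self.hidden_size = hidden_size
        # Word-state update sees [h_{i-1}, h_i, h_{i+1}, x_i, g] at every step.
        ctx_size = 3 * hidden_size + input_size + hidden_size
        self.word_gates = nn.Linear(ctx_size, 4 * hidden_size)  # i, f, o, u
        # Sentence-state update sees [g, mean(h)] (mean is a simplification).
        self.sent_gates = nn.Linear(2 * hidden_size, 4 * hidden_size)

    def forward(self, x):                       # x: (batch, seq_len, input_size)
        b, n, _ = x.shape
        h = x.new_zeros(b, n, self.hidden_size)  # one parallel state per word
        c = x.new_zeros(b, n, self.hidden_size)
        g = x.new_zeros(b, self.hidden_size)     # sentence-level state
        cg = x.new_zeros(b, self.hidden_size)
        for _ in range(self.steps):
            # Local exchange: neighbour states via zero-padded shifts.
            left = F.pad(h, (0, 0, 1, 0))[:, :n]    # h_{i-1}
            right = F.pad(h, (0, 0, 0, 1))[:, 1:]   # h_{i+1}
            ctx = torch.cat(
                [left, h, right, x, g.unsqueeze(1).expand(-1, n, -1)], dim=-1)
            i_, f_, o_, u_ = self.word_gates(ctx).chunk(4, dim=-1)
            # All word states update simultaneously (no left-to-right scan).
            c = torch.sigmoid(f_) * c + torch.sigmoid(i_) * torch.tanh(u_)
            h = torch.sigmoid(o_) * torch.tanh(c)
            # Global exchange: sentence state aggregates all word states.
            gi, gf, go, gu = self.sent_gates(
                torch.cat([g, h.mean(dim=1)], dim=-1)).chunk(4, dim=-1)
            cg = torch.sigmoid(gf) * cg + torch.sigmoid(gi) * torch.tanh(gu)
            g = torch.sigmoid(go) * torch.tanh(cg)
        return h, g  # per-word states for labelling, g for classification

# Usage: h, g = SLSTMSketch(100, 128)(torch.randn(2, 7, 100))
```

Because every word state updates in parallel at each step, depth in recurrent steps replaces sequence-length recurrence, which is the core contrast with a BiLSTM's incremental reading.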