Publication | Open Access
Leveraging Knowledge Bases in LSTMs for Improving Machine Reading
Citations: 253 | References: 43 | Year: 2017 | Venue: Unknown
Topics: Structured Prediction, LLM Fine-tuning, Engineering, Machine Learning, Corpus Linguistics, Text Mining, Natural Language Processing, Data Science, Knowledge Bases, Computational Linguistics, Named-entity Recognition, Machine Translation, Sequence Modelling, NLP Task, Retrieval Augmented Generation, Machine Reading, Relationship Extraction, Event Extraction, External Knowledge Bases
This paper focuses on how to take advantage of external knowledge bases (KBs) to improve recurrent neural networks for machine reading. Traditional methods that exploit knowledge from KBs encode knowledge as discrete indicator features. Not only do these features generalize poorly, but they require task-specific feature engineering to achieve good performance. We propose KBLSTM, a novel neural model that leverages continuous representations of KBs to enhance the learning of recurrent neural networks for machine reading. To effectively integrate background knowledge with information from the currently processed text, our model employs an attention mechanism with a sentinel to adaptively decide whether to attend to background knowledge and which information from KBs is useful. Experimental results show that our model achieves accuracies that surpass the previous state-of-the-art results for both entity extraction and event extraction on the widely used ACE2005 dataset.
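The abstract's core mechanism is an attention layer with a sentinel: at each step the model scores candidate KB concept embeddings against the LSTM hidden state, with an extra sentinel option that lets it fall back on the text-only representation when no KB knowledge is useful. The following is a minimal numpy sketch of that idea under stated assumptions; the function name `kb_attention_with_sentinel`, the projection matrices `W_v` and `W_s`, and the additive combination `h_t + m_t` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def kb_attention_with_sentinel(h_t, concept_embs, W_v, W_s, sentinel):
    """Mix KB concept embeddings into an LSTM hidden state.

    h_t          : (d,)   hidden state at the current token
    concept_embs : (k, d) embeddings of candidate KB concepts
    W_v, W_s     : (d, d) bilinear projections (hypothetical parameters)
    sentinel     : (d,)   learned vector representing "use no KB knowledge"
    """
    # score each candidate concept against the hidden state
    concept_scores = concept_embs @ (W_v @ h_t)
    # score the sentinel: how much to trust the text-only state
    sentinel_score = sentinel @ (W_s @ h_t)
    # joint softmax over concepts and the sentinel
    weights = softmax(np.append(concept_scores, sentinel_score))
    alpha, beta = weights[:-1], weights[-1]
    # knowledge mixture: weighted concepts plus the sentinel share
    m_t = alpha @ concept_embs + beta * sentinel
    # combine with the original hidden state (one plausible choice)
    return h_t + m_t, weights
```

Because the sentinel competes in the same softmax as the concepts, a high sentinel score drives all concept weights toward zero, which is how the model can adaptively ignore the KB for tokens where background knowledge would only add noise.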