Concepedia

Publication | Open Access

Comparative Study of CNN and RNN for Natural Language Processing

Citations: 894
References: 17
Year: 2017

TLDR

Deep neural networks have transformed NLP, with convolutional and recurrent architectures competing as leading models for extracting position‑invariant features and modeling sequential data, respectively, and the state of the art frequently shifting between them. This work is the first systematic comparison of CNN and RNN on a wide range of representative NLP tasks, aiming to give basic guidance for DNN selection.

Abstract

Deep neural networks (DNN) have revolutionized the field of natural language processing (NLP). Convolutional neural network (CNN) and recurrent neural network (RNN), the two main types of DNN architectures, are widely explored to handle various NLP tasks. CNN is supposed to be good at extracting position-invariant features and RNN at modeling units in sequence. The state of the art on many NLP tasks often switches due to the battle between CNNs and RNNs. This work is the first systematic comparison of CNN and RNN on a wide range of representative NLP tasks, aiming to give basic guidance for DNN selection.
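The abstract's contrast between the two architectures can be illustrated with a minimal sketch (pure NumPy, not code from the paper; all names and shapes here are illustrative assumptions): a width-3 convolution with max-over-time pooling produces the same summary vector when the same n-grams appear at different positions, while a simple recurrent encoder's final state depends on token order.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sentence": 6 tokens embedded in 4 dimensions. s2 cyclically
# shifts s1, so the same token trigrams occur at different positions.
s1 = rng.standard_normal((6, 4))
s2 = np.roll(s1, shift=2, axis=0)

def cnn_encode(x, filters):
    """Width-3 convolution over circular windows + max-over-time pooling.

    Circular windows make a cyclic shift permute the window set exactly,
    which isolates the position-invariance point.
    """
    n = len(x)
    windows = np.stack(
        [np.concatenate([x[(i + j) % n] for j in range(3)]) for i in range(n)]
    )
    return (windows @ filters).max(axis=0)   # one pooled value per filter

def rnn_encode(x, w_in, w_rec):
    """Elman-style recurrence: h_t = tanh(W_in x_t + W_rec h_{t-1})."""
    h = np.zeros(w_rec.shape[0])
    for x_t in x:
        h = np.tanh(w_in @ x_t + w_rec @ h)
    return h                                  # order-sensitive summary

filters = rng.standard_normal((12, 5))        # 3 tokens * 4 dims -> 5 filters
w_in = rng.standard_normal((5, 4))
w_rec = 0.5 * rng.standard_normal((5, 5))

# Max-pooled CNN features ignore where the trigrams sit ...
print(np.allclose(cnn_encode(s1, filters), cnn_encode(s2, filters)))  # True
# ... while the RNN's final state changes when token order changes.
print(np.allclose(rnn_encode(s1, w_in, w_rec), rnn_encode(s2, w_in, w_rec)))
```

This mirrors the division of labor the abstract describes: the pooled convolutional features act as position-invariant n-gram detectors, whereas the recurrent state accumulates order-dependent context.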
