Publication | Open Access
Deep Predictive Coding Network with Local Recurrent Processing for Object Recognition
Year: 2018
Inspired by predictive coding, a theory from neuroscience, we develop a bi-directional, dynamic neural network with local recurrent processing, named the predictive coding network (PCN). Unlike feedforward-only convolutional neural networks, PCN includes both feedback connections, which carry top-down predictions, and feedforward connections, which carry bottom-up prediction errors. These connections enable adjacent layers to interact locally and recurrently, refining representations to minimize layer-wise prediction errors. When unfolded over time, this recurrent processing gives rise to an increasingly deep hierarchy of nonlinear transformations, allowing a shallow network to dynamically extend itself into an arbitrarily deep one. We train and test PCN for image classification on the SVHN, CIFAR, and ImageNet datasets. Despite having notably fewer layers and parameters, PCN achieves performance competitive with classical and state-of-the-art models. Further analysis shows that PCN's internal representations converge over time and yield increasingly accurate object recognition. Top-down prediction errors also reveal visual saliency, or bottom-up attention.
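The local recurrent loop described above can be sketched in a few lines. The following is a minimal toy illustration, not the paper's exact formulation: a higher layer predicts the lower layer's activity through feedback weights, and the resulting prediction error is fed forward to refine the higher-layer representation over unfolded time steps. All names (`W_fb`, `r_low`, `r_high`) and the linear, two-layer setup are illustrative assumptions.

```python
import numpy as np

# Toy sketch of one predictive-coding loop between two adjacent layers
# (illustrative assumption; the actual PCN uses convolutional layers).
rng = np.random.default_rng(0)

dim_low, dim_high = 8, 4
W_fb = rng.standard_normal((dim_low, dim_high)) * 0.1  # top-down (feedback) weights
r_low = rng.standard_normal(dim_low)                   # fixed lower-layer activity (input)
r_high = np.zeros(dim_high)                            # higher-layer representation

lr = 0.1
errors = []
for t in range(20):                # unfold recurrent processing over time
    pred = W_fb @ r_high           # feedback: top-down prediction of the lower layer
    err = r_low - pred             # bottom-up prediction error
    r_high += lr * (W_fb.T @ err)  # feedforward: error refines the representation
    errors.append(float(np.linalg.norm(err)))

print(errors[0], errors[-1])       # prediction error shrinks as the loop iterates
```

Each iteration is a gradient step on the layer-wise prediction error, so unfolding the loop behaves like an increasingly deep stack of transformations applied by a single shallow pair of layers.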