Publication | Open Access
Towards an integration of deep learning and neuroscience
Citations: 681 · References: 396 · Year: 2016
Neuroscience has focused on the detailed implementation of computation, studying neural codes, dynamics and circuits. In machine learning, however, artificial neural networks tend to eschew precisely designed codes, dynamics or circuits in favor of brute force optimization of a cost function, often using simple and relatively uniform initial architectures. Two recent developments have emerged within machine learning that create an opportunity to connect these seemingly divergent perspectives. First, structured architectures are used, including dedicated systems for attention, recursion and various forms of short- and long-term memory storage. Second, cost functions and training procedures have become more complex and are varied across layers and over time. Here we think about the brain in terms of these ideas. We hypothesize that (1) the brain optimizes cost functions, (2) these cost functions are diverse and differ across brain locations and over development, and (3) optimization operates within a pre-structured architecture matched to the computational problems posed by behavior. Such a heterogeneously optimized system, enabled by a series of interacting cost functions, serves to make learning data-efficient and precisely targeted to the needs of the organism. We suggest directions by which neuroscience could seek to refine and test these hypotheses.