Predictive coding is a consequence of energy efficiency in recurrent neural networks
Author(s): Ali, Abdullahi; Ahmad, Nasir; de Groot, Elgar; van Gerven, Marcel Antonius Johannes; Kietzmann, Tim Christian
Keywords: ACTION-POTENTIALS; BRAIN; Computer Science; Computer Science, Artificial Intelligence; Computer Science, Information Systems; Computer Science, Interdisciplinary Applications; NEOCORTEX; NEURONS; PERCEPTION; SLOW FEATURE ANALYSIS; STATISTICS
Publication date: 2022
Publisher: CELL PRESS
Published in: PATTERNS
Volume: 3
Issue: 12
Abstract: Predictive coding is a promising framework for understanding brain function. It postulates that the brain continuously inhibits predictable sensory input, ensuring preferential processing of surprising elements. A central aspect of this view is its hierarchical connectivity, involving recurrent message passing between excitatory bottom-up signals and inhibitory top-down feedback. Here we use computational modeling to demonstrate that such architectural hardwiring is not necessary. Rather, predictive coding is shown to emerge as a consequence of energy efficiency. When training recurrent neural networks to minimize their energy consumption while operating in predictive environments, the networks self-organize into prediction and error units with appropriate inhibitory and excitatory interconnections and learn to inhibit predictable sensory input. Moving beyond the view of purely top-down-driven predictions, we demonstrate, via virtual lesioning experiments, that networks perform predictions on two timescales: fast lateral predictions among sensory units and slower prediction cycles that integrate evidence over time.
ISSN: 2666-3899
DOI: 10.1016/j.patter.2022.100639
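The abstract describes training recurrent networks to predict their sensory input while minimizing energy consumption. A minimal sketch of that training setup (not the authors' implementation: the network size, the sine-wave input stream, the activity penalty weight `lambda_e`, and all variable names are assumptions) is a small RNN trained with a next-step prediction loss plus an energy penalty on unit activity:

```python
import numpy as np

# Illustrative sketch only: a tiny recurrent network trained to predict the
# next element of a predictable input stream, with an added "energy" term
# (lambda_e * sum(h^2)) penalizing unit activity. The paper's claim is that
# this kind of energy minimization alone can induce predictive-coding-like
# structure; here we only show the objective, not the emergent analysis.

rng = np.random.default_rng(0)
n_in, n_hid = 1, 16
Wx = rng.normal(0, 0.5, (n_hid, n_in))   # input weights
Wh = rng.normal(0, 0.1, (n_hid, n_hid))  # recurrent weights
Wo = rng.normal(0, 0.5, (n_in, n_hid))   # readout (prediction) weights
lambda_e = 1e-3                          # energy-penalty weight (assumed value)
lr = 0.02

def run_epoch(seq, train=True):
    """One pass over the sequence; returns mean (prediction + energy) loss."""
    global Wx, Wh, Wo
    h = np.zeros(n_hid)
    total = 0.0
    for t in range(len(seq) - 1):
        x = np.array([seq[t]])
        h_new = np.tanh(Wx @ x + Wh @ h)
        y = Wo @ h_new                          # prediction of the next input
        err = y - np.array([seq[t + 1]])
        total += float(err @ err) + lambda_e * float(h_new @ h_new)
        if train:
            # One-step (truncated) gradients -- enough for this illustration.
            dy = 2 * err
            dWo = np.outer(dy, h_new)
            dh = Wo.T @ dy + 2 * lambda_e * h_new  # energy term enters here
            dpre = dh * (1 - h_new ** 2)
            dWx = np.outer(dpre, x)
            dWh = np.outer(dpre, h)
            for g in (dWo, dWx, dWh):              # clip for stability
                np.clip(g, -1.0, 1.0, out=g)
            Wo -= lr * dWo
            Wx -= lr * dWx
            Wh -= lr * dWh
        h = h_new
    return total / (len(seq) - 1)

seq = np.sin(np.linspace(0, 8 * np.pi, 400))  # a fully predictable stream
first = run_epoch(seq, train=False)
for _ in range(15):
    run_epoch(seq)
last = run_epoch(seq, train=False)
```

Because the stream is predictable, minimizing this combined objective drives the network both to anticipate its input and to keep activity low; the paper's lesioning analyses probe the structure that emerges from objectives of this kind.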