Overview: Deep neural networks (DNNs) have become the de-facto standard tool for carrying out complex learning tasks. DNNs belong to the second generation of artificial neural networks (ANNs), which rely on neurons that implement memory-less non-linear transformations of the synaptic inputs. Motivated by the biological analogy with the behavior of neurons in the brain, the third generation of neural networks, also referred to as Spiking Neural Networks (SNNs), was introduced in the nineties. In SNNs, synaptic input and neuronal output signals are spike trains. This proposal argues that the time for the use of SNNs as machine learning tools has come, and sets forth a systematic approach for the design and implementation of SNNs as learning and inference machines.

Intellectual merit: SNNs have a number of unique advantages as compared to ANNs: (i) they are event-based systems with natural sparsity properties, which have the potential to make deep learning machines feasible for energy-limited devices; (ii) they are uniquely capable of natively processing data that comes in the form of time-encoded processes, for example, from bio-inspired sensors. The main goal of this project is the establishment of a theoretical framework to enable the design of flexible spike-domain learning algorithms that are tailored to the solution of supervised and unsupervised cognitive tasks, as well as their co-optimization on nanoscale hardware architectures. To this end, this project puts forth a principled probabilistic framework based on the graphical formalism of Directed Information Graphs.

Broader impact: The outcome of this research is expected to have a profound impact on the increasing number of practical applications that are based on the processing of time-encoded signals, including biological sensors and next-generation communication systems, and/or that require the adoption of computing solutions with a significantly smaller power budget than conventional DNNs. The research methodology is based on a multi-disciplinary approach that integrates machine learning, information theory, probabilistic graphical models, neuromorphic computing, and device/system architecture at the nanoscale. The educational plan at the home institution targets both undergraduate and graduate students via hands-on learning and experimentation activities.
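To illustrate the spike-train signaling that distinguishes SNNs from memory-less ANN neurons, the following is a minimal leaky integrate-and-fire (LIF) neuron sketch. It is purely illustrative and not the model developed in this project; the time constants, threshold, and input level are arbitrary assumptions chosen for the example.

```python
import numpy as np

def lif_spike_train(input_current, dt=1e-3, tau=20e-3,
                    v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    Unlike a memory-less ANN neuron, the membrane potential integrates
    past inputs over time; the output is a binary spike train.
    """
    v = v_reset
    spikes = np.zeros(len(input_current), dtype=int)
    for t, i_t in enumerate(input_current):
        # Euler step of the leaky membrane dynamics: dv/dt = (-v + i) / tau
        v += (dt / tau) * (-v + i_t)
        if v >= v_thresh:
            spikes[t] = 1      # emit a spike ...
            v = v_reset        # ... and reset the membrane potential
    return spikes

# A constant supra-threshold input yields a sparse, regular spike train.
spikes = lif_spike_train(np.full(200, 2.0))
```

The event-based sparsity mentioned above is visible here: the neuron emits only occasional spikes rather than a dense real-valued activation at every time step.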
Effective start/end date: 8/1/17 → 7/31/20
- National Science Foundation