
CogniGron@Work: Lambert Schomaker - "How to bridge the gap between current deep learning and current neuromorphic computing?"

When: Friday 18-12-2020, 13:00 - 14:00
Where: Online


It has become clear that many researchers in neuromorphic computing focus on training methods for multi-layer perceptrons, pointing to a large number of layers in a network as the explanation for the 'deep' aspect of neural computing. However, the success of deep learning in current convolutional neural networks (CNNs) is due to many factors, the most important of them being the use of 2D convolutions. This means that each layer consists of trainable filters: a square filter kernel is dynamically slid over a large 2D field (the image) with a stride of 1 pixel in both directions, performing a weight-matrix-times-vector multiplication at each x,y position. For each hidden unit, there is also a memory field (the feature map), storing the result and delivering it as input to the filters of the next layer.

Such dynamic functionality relies heavily on Turing/von Neumann controllers outside the core of the neuromorphic weight-update hardware (e.g., a crossbar with memristors). These complications weaken the 'low energy' argument: more computation is needed than the multiply-add operator alone. Consequently, there is a strong need to also address trainable filters in materials science. As an example, a tedious convolution over x and y can be replaced by an effective one-shot optical filter applied to a complete 2D plane. For electric variants of neuromorphic computing, a similar wide-field lensing would need to be implemented. Intermediate-representation images at the level of a hidden unit require persistence, at least for the period during which a filter in the next layer is receiving the (usually 2D) pattern. Only when this is addressed can we realize a complete emulation of current deep learning in novel hardware, within CogniGron.
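To make the two filtering views in the abstract concrete, here is a minimal NumPy sketch (not part of the talk material, just an illustration under the stated description): the first function implements the stride-1 sliding-window filtering of a 2D plane as described above, while the second expresses the same operation as a single element-wise product in the Fourier domain, the digital analogue of a one-shot optical filter over the whole plane. The array sizes and the kernel are arbitrary example values.

```python
import numpy as np

def sliding_window_conv2d(image, kernel):
    """Valid-mode 2D correlation: slide the filter over the image with a
    stride of 1 in both directions, doing a weights-times-patch dot product
    (multiply-add) at each (x, y) position, as in a CNN layer."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))   # the resulting feature map
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + kh, x:x + kw]
            out[y, x] = np.sum(patch * kernel)
    return out

def fourier_conv2d(image, kernel):
    """The same filtering expressed as one element-wise product in the
    frequency domain (convolution theorem): the whole 2D plane is filtered
    'in one shot', analogous to a wide-field optical filter."""
    H, W = image.shape
    kh, kw = kernel.shape
    k = np.flip(kernel)                         # correlation = conv with flipped kernel
    shape = (H + kh - 1, W + kw - 1)            # zero-pad to avoid wrap-around
    full = np.fft.irfft2(np.fft.rfft2(image, s=shape) *
                         np.fft.rfft2(k, s=shape), s=shape)
    return full[kh - 1:H, kw - 1:W]             # crop to the 'valid' region

rng = np.random.default_rng(0)
image = rng.standard_normal((32, 32))           # one input plane / feature map
kernel = rng.standard_normal((3, 3))            # one trainable filter

a = sliding_window_conv2d(image, kernel)
b = fourier_conv2d(image, kernel)
print(np.allclose(a, b))                        # True: both give the same filtered map
```

Note how the sliding-window version needs an explicit loop over all (x, y) positions plus a stored feature map, which is exactly the control and memory overhead the abstract attributes to external Turing/von Neumann controllers, whereas the Fourier version filters the entire plane in one step.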

How to join:
Weblink will follow