
CogniGron Seminar: James Smith (University of Wisconsin-Madison, USA) - "A Temporal Neural Network Architecture for Online Learning"

When: Mon 15-02-2021, 16:00 - 17:00
Where: Online

A long-standing proposition is that by emulating the operation of the brain’s neocortex, a spiking neural network (SNN) can achieve similar desirable features: flexible learning, speed, and efficiency. Temporal neural networks (TNNs) are SNNs that communicate and process information encoded as relative spike times (in contrast to spike rates). A TNN architecture is proposed, and as a proof of concept, TNN operation is demonstrated within the larger context of online supervised classification. First, through unsupervised learning, a TNN partitions input patterns into clusters based on similarity. The TNN learning process adjusts synaptic weights by using only signals local to each synapse, and global clustering behavior emerges. The TNN then passes a cluster identifier to a simple online supervised decoder which finishes the classification task. Besides features of the overall architecture, several TNN components and methods are new to this work. A long-term research objective is a direct hardware implementation. Consequently, the architecture is described at a level analogous to the gate and register transfer levels used in conventional digital design, and processing is done at very low precision.
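The pipeline sketched in the abstract (encode inputs as relative spike times, cluster them with a local, synapse-level learning rule, and emit a cluster identifier) can be illustrated with a small toy example. The sketch below is purely illustrative, assuming a simple time-to-first-spike code, winner-take-all competition, and an STDP-like local weight update; all function names, parameters, and constants are this editor's assumptions, not the architecture presented in the talk:

```python
# Toy TNN-style pipeline (illustrative only, not Prof. Smith's actual design).
# Inputs are encoded as relative spike times (stronger feature -> earlier spike),
# a small population of neurons competes winner-take-all on firing time, and the
# winner adapts each synapse using only locally available information: the input
# spike time at that synapse.

import random

def encode(pattern, t_max=8):
    """Encode feature intensities in [0, 1] as spike times: stronger -> earlier."""
    return [round(t_max * (1.0 - x)) for x in pattern]

def response_time(spike_times, weights, threshold):
    """Integrate weighted spikes in time order; return the time step at which the
    accumulated potential first crosses threshold (None if it never fires)."""
    potential = 0.0
    for t in range(max(spike_times) + 1):
        for s, w in zip(spike_times, weights):
            if s == t:
                potential += w
        if potential >= threshold:
            return t
    return None

def classify_and_learn(spike_times, neurons, lr=0.2, threshold=1.0):
    """Winner-take-all: the earliest-firing neuron wins (its index is the
    cluster identifier) and nudges each of its weights toward a target derived
    only from that synapse's input spike time (an STDP-like local rule)."""
    times = [response_time(spike_times, w, threshold) for w in neurons]
    fired = [(t, i) for i, t in enumerate(times) if t is not None]
    if not fired:
        return None
    _, winner = min(fired)
    t_max = max(spike_times) or 1
    for j, s in enumerate(spike_times):
        target = 1.0 - s / t_max  # earlier input spike -> strengthen synapse
        neurons[winner][j] += lr * (target - neurons[winner][j])
    return winner

# Usage: two distinct input patterns settle into different clusters.
random.seed(0)
neurons = [[random.random() for _ in range(4)] for _ in range(2)]
A, B = [1, 1, 0, 0], [0, 0, 1, 1]
for _ in range(5):
    cluster_a = classify_and_learn(encode(A), neurons)
    cluster_b = classify_and_learn(encode(B), neurons)
print(cluster_a, cluster_b)  # the two patterns end up in different clusters
```

In the architecture described in the abstract, the cluster identifier returned here would then be handed to a simple online supervised decoder (e.g., a per-cluster label count) to complete the classification; that decoder is the only supervised component.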

About James Smith
James E. Smith is Professor Emeritus in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison. He received his PhD from the University of Illinois in 1976. He then joined the faculty of the University of Wisconsin-Madison, teaching and conducting research, first in fault-tolerant computing, then in computer architecture. He has been involved in a number of computer research and development projects both as a faculty member at Wisconsin and in industry.

Prof. Smith made a number of contributions to the development of superscalar processors. These contributions include basic mechanisms for dynamic branch prediction and implementing precise traps. He has also studied vector processor architectures and worked on the development of innovative microarchitecture paradigms. He received the 1999 ACM/IEEE Eckert-Mauchly Award for these contributions.

For the past several years, he has been studying neuron-based computing paradigms at home along the Clark Fork near Missoula, Montana.

More information about James Smith