Learning to Predict Stable Overlapping Memories from Spiking Neural Networks
Type and duration:
Bachelor project, flexible duration.
Graph theory (GT) is a useful framework for analyzing relations between objects and for modeling complex systems, from proteins to social networks, by representing them as graphs. Metrics from GT can be used to characterize different types of graphs and the systems they abstract, making it a suitable framework for analyzing complex networks. This project will use Machine Learning (ML) and GT to train regression models that predict the maximum overlap between patterns that can be supported by Recurrent Competitive Networks (RCNs) as a function of their topological features.
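As a minimal sketch of what "topological features" could mean here, the snippet below extracts a small feature vector (density, mean degree, mean clustering coefficient) from a toy graph. The graph and the chosen metrics are illustrative assumptions, not the project's actual feature set, and a plain adjacency-set representation is used so no graph library is required.

```python
# Illustrative toy graph as an adjacency-set dict (undirected).
graph = {
    0: {1, 2},
    1: {0, 2},
    2: {0, 1, 3},
    3: {2},
}

n = len(graph)
m = sum(len(nbrs) for nbrs in graph.values()) // 2  # each edge counted twice

def clustering(node):
    """Fraction of a node's neighbor pairs that are themselves connected."""
    nbrs = graph[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in graph[u])
    return 2.0 * links / (k * (k - 1))

# A hypothetical per-graph feature vector of the kind a regression
# model could be trained on.
features = {
    "density": 2.0 * m / (n * (n - 1)),
    "mean_degree": 2.0 * m / n,
    "mean_clustering": sum(clustering(v) for v in graph) / n,
}
print(features)
```

In practice a library such as NetworkX offers many more graph metrics; the point here is only that each graph collapses to a fixed-length numeric vector suitable for regression.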
- Review of ML regression models and metrics.
- Review of graph metrics.
- Training of regression model on graph feature vectors.
- Performance assessment of the trained regression model.
- Feature selection (dimensionality reduction).
- Training with reduced feature set.
- Performance assessment of the updated regression model.
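The training, assessment, and feature-selection steps above can be sketched end to end on synthetic data. Everything here is a stand-in assumption: the data invents one informative feature and one noise feature, the model is a closed-form univariate least-squares fit, and "feature selection" is simply keeping the feature with the best held-out R². The real project would use a proper ML library and RCN-derived data.

```python
import random

random.seed(0)

# Hypothetical toy data standing in for (graph features, max overlap) pairs:
# feature 0 fully determines the target, feature 1 is pure noise.
n = 40
X = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(n)]
y = [2.0 * x[0] + 1.0 for x in X]

# Simple train/test split for performance assessment.
X_tr, X_te = X[:30], X[30:]
y_tr, y_te = y[:30], y[30:]

def fit_univariate(xs, ys):
    """Closed-form least squares for y = a*x + b."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    sxy = sum((x - mx) * (yv - my) for x, yv in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

def r2(ys, preds):
    """Coefficient of determination, a standard regression metric."""
    my = sum(ys) / len(ys)
    ss_res = sum((yv - p) ** 2 for yv, p in zip(ys, preds))
    ss_tot = sum((yv - my) ** 2 for yv in ys)
    return 1.0 - ss_res / ss_tot

# Score each feature separately on held-out data and keep the best one
# (a crude form of feature selection / dimensionality reduction).
scores = []
for j in range(2):
    a, b = fit_univariate([x[j] for x in X_tr], y_tr)
    preds = [a * x[j] + b for x in X_te]
    scores.append(r2(y_te, preds))

best = max(range(2), key=lambda j: scores[j])
print(best, scores[best])
```

With scikit-learn the same loop would shrink to a `Pipeline` of a feature selector and a regressor, which is the kind of tooling the review steps above would cover.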
Prerequisites:
Python programming; basic machine learning (optional).
References:
- R. Wilson, Introduction to Graph Theory. Longman, 2010.
- F. Jug, M. Cook, and A. Steger, "Recurrent competitive networks can learn locally excitatory topologies," pp. 1–8, June 2012.
Last modified: 11 October 2022, 11:23 a.m.