Learning to Predict Memory Robustness from Neural Networks
Type and duration:
Bachelor project, flexible duration.
Graph theory (GT) is a useful framework for analyzing relations between objects and for modeling complex systems, from proteins to social networks, by representing them as graphs. Metrics from GT can be used to characterize different types of graphs and the systems they abstract. This project will use machine learning (ML) to train classifiers that predict memory robustness from graph representations derived from a recurrent spiking neural network model of working memory.
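As a minimal sketch of the graph-representation idea, the snippet below derives a graph from a hypothetical recurrent-network weight matrix by thresholding strong connections, then computes a few standard GT metrics. The libraries (numpy, networkx), the threshold, and the random weights are illustrative assumptions, not part of the project specification.

```python
# Illustrative sketch: a random matrix stands in for learned synaptic
# weights; the project's actual weights would come from the simulated
# recurrent spiking neural network.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
weights = rng.random((20, 20))            # stand-in for a weight matrix
np.fill_diagonal(weights, 0.0)            # ignore self-connections

adjacency = (weights > 0.7).astype(int)   # keep only strong connections
g = nx.from_numpy_array(adjacency, create_using=nx.DiGraph)

# A few graph-theoretic metrics that could enter a feature vector.
print("density:", nx.density(g))
print("mean clustering:", nx.average_clustering(g))
```

Each such metric becomes one entry of a fixed-length feature vector, so graphs of any size can be compared by the same classifier.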
Tasks:
- Review of machine learning models and metrics.
- Review of graph metrics.
- Training of classifiers on a toy problem.
- Training of classifiers on graph feature vectors.
- Performance assessment of the trained classifiers.
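The training and assessment steps above can be sketched as follows, assuming scikit-learn. The graph-metric feature vectors and robustness labels are synthetic stand-ins here; in the project they would be computed from the simulated networks.

```python
# Hedged sketch: train a classifier on graph-metric feature vectors and
# assess it with cross-validation. Data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_graphs, n_metrics = 200, 6
X = rng.normal(size=(n_graphs, n_metrics))     # one metric vector per graph
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in robustness label

clf = RandomForestClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=5)      # performance assessment
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validation gives a less optimistic performance estimate than a single train/test split, which matters when the number of simulated networks is small.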
Prerequisites:
Python programming; basic machine learning (optional).
References:
- Jug, F., Cook, M., & Steger, A. (2012). Recurrent competitive networks can learn locally excitatory topologies. The 2012 International Joint Conference on Neural Networks (IJCNN), 1, 1–8.
Last modified: 05 April 2022, 10:54 a.m.