Meta-Learning with Memory-Augmented Neural Networks
Memory-Augmented Neural Networks (MANNs) are a class of neural networks that use an external memory to store information. MANNs show promising results in solving long-term dependency problems compared with traditional Recurrent Neural Networks (RNNs). By using an external memory for both input and output, the network gains the ability to keep track of and retain longer sequences of information. Well-known examples of MANNs are the Neural Turing Machine (NTM), the Differentiable Neural Computer (DNC), and the SGDStore network. Known issues with the NTM are data overwriting and the inability to free and reuse memory. To address these issues, the DNC employs differentiable attention mechanisms for reading and writing, while SGDStore uses additional neural networks as a data store.
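The differentiable attention-based reading shared by these models can be illustrated with a minimal content-based addressing sketch in the NTM style: a query key is compared to every memory row by cosine similarity, the similarities are sharpened and normalized into read weights, and the read vector is the weighted sum of memory rows. The function name, the toy memory, and the sharpening parameter below are illustrative, not taken from the thesis:

```python
import numpy as np

def content_based_read(memory, key, beta=1.0):
    """NTM-style content addressing sketch.

    memory: (N, M) matrix of N memory rows.
    key:    (M,) query vector.
    beta:   sharpening factor; larger beta concentrates the
            read weights on the best-matching row.
    """
    # cosine similarity between the key and each memory row
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + 1e-8)
    # softmax over sharpened similarities gives the read weights
    weights = np.exp(beta * sims)
    weights /= weights.sum()
    # read vector is the weighted sum of memory rows
    return weights @ memory, weights

# toy usage: the key matches the first memory row most closely
memory = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
read_vec, w = content_based_read(memory, np.array([1.0, 0.0]), beta=5.0)
```

Because every step is differentiable, gradients flow through the read weights, which is what lets these memories be trained end to end.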
This thesis will build and evaluate various MANN models in terms of performance on classification and video-prediction tasks, similarity functions, and memory efficiency. The objective is to match state-of-the-art performance with less memory and a smaller model architecture. The thesis will use locality-sensitive hashing (LSH) to reduce the complexity of computing similarity functions over the entire dataset, and Reversible Residual Networks (RevNets) to use the available memory more efficiently. The evaluation metrics will be classification metrics over multiple episodic training strategies. The target datasets are a hand-written character dataset, a traffic-sign and autonomous-driving dataset, and Atari video-game datasets.
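The LSH idea can be sketched with random hyperplane hashing, a standard scheme: each vector receives a binary signature from the signs of its projections onto random hyperplanes, and similarity search then only compares vectors within the same bucket instead of scanning the whole dataset. The helper name and parameters below are illustrative:

```python
import numpy as np

def lsh_buckets(vectors, n_planes=8, seed=0):
    """Random-hyperplane LSH sketch.

    vectors:  (N, D) matrix of N data points.
    n_planes: signature length; more planes means finer buckets.
    Returns a dict mapping a binary signature to the indices of
    the vectors that hash to it.
    """
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_planes, vectors.shape[1]))
    # boolean signature: which side of each hyperplane the point lies on
    signatures = (vectors @ planes.T) > 0
    buckets = {}
    for i, sig in enumerate(signatures):
        buckets.setdefault(tuple(sig), []).append(i)
    return buckets

# toy usage: identical vectors always receive the same signature
rng = np.random.default_rng(1)
data = rng.standard_normal((4, 16))
data[1] = data[0]  # make index 1 a duplicate of index 0
buckets = lsh_buckets(data)
```

Nearby vectors tend to fall on the same side of most hyperplanes, so the expected cost of a similarity query drops from comparing against all N points to comparing within one bucket.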
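The memory saving from RevNets comes from reversible residual blocks: the block's inputs can be recomputed exactly from its outputs, so intermediate activations need not be stored for backpropagation, trading extra compute for memory. A minimal sketch of the forward and inverse computations, with illustrative function names and toy residual functions:

```python
import numpy as np

def revnet_forward(x1, x2, f, g):
    """Reversible residual block: inputs are split into two halves,
    and each output depends additively on a residual function."""
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def revnet_inverse(y1, y2, f, g):
    """Exact inversion of the block: subtract the residuals in
    reverse order to recover the original inputs."""
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

# toy usage: any f and g work, since invertibility comes from the
# additive coupling structure, not from f or g themselves
f = lambda v: np.tanh(v)
g = lambda v: 0.5 * v
x1, x2 = np.array([1.0, -2.0]), np.array([0.5, 3.0])
r1, r2 = revnet_inverse(*revnet_forward(x1, x2, f, g), f, g)
```

During training, a RevNet recomputes activations on the fly in the backward pass instead of caching them, which is the memory-efficiency property the thesis aims to exploit.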