NLP: Multi Level Predictive Language Model
Simply put, the goal of grammar learning is to learn the correct order of words in a sentence. This assumption is the fundamental statement of modern predictive language models. Early language models operated with words as the atomic representation units, while more recent approaches, e.g., BERT, are based on subword units (word pieces). The goal of this project is to broaden the scope in both directions: to work with characters and to work with multi-word expressions. The prediction is thus to be conducted on different levels of the language, while these levels are also interconnected. The student will work on a novel artificial neural network architecture and will compare its performance to state-of-the-art techniques.
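As an illustrative sketch only (not part of the project specification), the sentence below is represented at three granularities. The character and word levels are standard; the "expression" level is approximated here by adjacent word bigrams, a hypothetical stand-in for the learned multi-word units the project would investigate.

```python
sentence = "language models predict the next token"

# Character level: the finest granularity.
chars = list(sentence)

# Word level: the classic atomic unit of early language models.
words = sentence.split()

# Expression level: naively approximated by adjacent word bigrams.
# A real system would learn these units rather than enumerate pairs.
expressions = [" ".join(pair) for pair in zip(words, words[1:])]

print(words)
print(expressions)
```

A multi-level model would predict at each of these granularities while sharing information between levels, e.g., character context informing word prediction and vice versa.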