I started my PhD in the Probabilistic Numerics Group in September 2017.
My work aims at ridding the field of Deep Learning of annoying hyperparameters and thus automating the training of Deep Neural Networks.
Nowadays, anyone training a neural network must try out different combinations of learning rate, batch size, etc. until they find a combination that "works", and only then start training with this "magic combination". A lot of computational and human resources are wasted on finding these hyperparameters, which is why there is a great need to automate these hyperparameter choices.
One idea is to replace the hand-tuned learning rate of SGD (used to train the neural net) with a probabilistic line search (Mahsereci and Hennig, 2017). Over the coming months, I will continue this line of work to turn the probabilistic line search into a practical tool.
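To illustrate the general idea of replacing a fixed learning rate with an automatic step-size choice, here is a minimal sketch of a classical (deterministic) backtracking line search with the Armijo condition, applied to a toy quadratic. This is not the probabilistic line search of Mahsereci and Hennig, which additionally handles the noisy gradients of mini-batch SGD; all function and parameter names below are illustrative.

```python
import numpy as np

def armijo_line_search(f, grad, x, direction, t0=1.0, beta=0.5, c=1e-4, max_iter=50):
    """Shrink the step size t until the Armijo sufficient-decrease
    condition f(x + t*d) <= f(x) + c*t*<grad f(x), d> holds."""
    t = t0
    fx = f(x)
    slope = c * (grad(x) @ direction)
    for _ in range(max_iter):
        if f(x + t * direction) <= fx + t * slope:
            return t
        t *= beta  # backtrack
    return t

# Toy objective: f(x) = ||x||^2, minimized at the origin.
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x

x = np.array([3.0, -4.0])
for _ in range(100):
    d = -grad(x)                        # steepest-descent direction
    t = armijo_line_search(f, grad, x, d)  # step size chosen automatically
    x = x + t * d

print(np.allclose(x, 0.0, atol=1e-6))  # → True
```

The point of the sketch is that no learning rate is supplied by hand; the line search picks a step size at every iteration. The probabilistic version replaces the exact function and gradient evaluations above with a Gaussian-process model over noisy observations.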
I joined the Probabilistic Numerics Group as part of the IMPRS-IS (International Max Planck Research School for Intelligent Systems) to work on optimization methods for machine learning.
Prior to joining the MPI, I studied Simulation Technology (B.Sc. and M.Sc.) at the University of Stuttgart and Industrial and Applied Mathematics (M.Sc.) at the Technische Universiteit Eindhoven. My Master's thesis was on constructing preconditioners for Toeplitz matrices. This project was carried out at ASML (Eindhoven), a company developing lithography systems for the semiconductor industry.
Optimization problems arising in intelligent systems are similar to those studied in other fields (such as operations research, control, and computational physics), but they have some prominent features that set them apart, and which are not addressed by classic optimization methods.