Optimisation methods for (deep) neural networks

A machine learning algorithm can be viewed as a mapping between inputs and outputs, and the aim of training is to optimise and tune the parameters that describe this mapping so that performance improves. For artificial neural networks (including deep networks), the resulting loss function (or cost function) to be optimised is non-convex and has multiple minima. Finding the global minimum of such a function is not a simple task, as it is an NP-hard problem. The project will investigate existing optimisation methods, as well as the development of new ones, for finding the local minima, the saddle points (of index one) and the global minima of the loss function characterising non-trivial deep neural networks. The focus will be on population-based methods such as variants of genetic algorithms, differential evolution, etc.
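To give a flavour of the population-based approach, the following is a minimal sketch (not the project's actual method) of classic differential evolution minimising the Rastrigin function, a standard non-convex test function with many local minima; the population size, mutation factor F and crossover rate CR below are illustrative choices, not prescribed values.

import numpy as np

def rastrigin(x):
    # Standard non-convex test function: global minimum 0 at x = 0,
    # surrounded by a regular lattice of local minima.
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           generations=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    low, high = np.array(bounds).T
    # Initialise the population uniformly inside the search box.
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than i.
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            # Mutation: perturb a by the scaled difference of b and c.
            mutant = np.clip(a + F * (b - c), low, high)
            # Binomial crossover between the target vector and the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial only if it lowers the loss.
            f_trial = f(trial)
            if f_trial < fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

if __name__ == "__main__":
    x_best, f_best = differential_evolution(rastrigin,
                                            bounds=[(-5.12, 5.12)] * 5)
    print(f"best point {x_best}, loss {f_best:.6f}")

Because the update relies only on loss evaluations and greedy selection, methods of this family need no gradient information, which is one reason they are attractive for exploring the saddle points and multiple minima of neural-network loss landscapes.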

This is a self-funded project.

Brunel offers a number of funding options to research students that help cover the cost of their tuition fees, contribute to living expenses, or both. See more information here: https://www.brunel.ac.uk/research/Research-degrees/Research-degree-funding. Recently, the UK Government made Doctoral Student Loans of up to £25,000 available to UK and EU students, and there is some funding available through the Research Councils. Many of our international students benefit from funding provided by their governments or employers. Brunel alumni enjoy tuition fee discounts of 15%.