Submitted Abstract
The aim of this Ph.D. project is to develop a generic Bayesian optimisation scheme that leverages recent advances in theoretical neurobiology, namely active inference. This would endow the field of Bayesian optimisation with a novel acquisition function based upon optimal Bayesian design: the expected free energy. This acquisition function should, in principle, outperform its counterparts (e.g. probability of improvement, expected improvement or upper confidence bounds).

In brief, we propose to reformulate any optimisation problem as a process of active inference by treating the evaluation of the objective function (i.e. the log-likelihood function in Bayesian inversion) as an observation, and the choice of location in parameter space as an action. Crucially, active inference holds that Bayes-optimal actions are those that minimise expected free energy. Selecting such actions resolves the exploration-exploitation dilemma with respect to the prior preferences endowed to the agent. In the proposed scheme, these priors would be exactly fit for purpose: sampling extreme values of the objective function. This should allow a much more efficient search than current schemes.

We propose to prove that expected free energy outperforms its counterparts and that our scheme converges to optima, and to investigate convergence rates. Practically, we would benchmark against other generic optimisation schemes in practical simulations. Finally, we would conclude with a publicly available software implementation.

This work would be carried out between the Department of Applied Mathematics at Imperial College London and the group of Professor Karl Friston, the father of active inference, at University College London.

To conclude, the potential outcome of this project is a Bayes-optimal generic optimisation scheme that is supported by theoretical neurobiology and outperforms its counterparts on a variety of tasks.
The implications of this work would be widespread in optimisation, machine learning and robotics, all of which make extensive use of Bayesian optimisation.
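The scheme itself is the subject of the proposed thesis; purely as an illustration of the reformulation described above, the sketch below shows one plausible way an expected-free-energy acquisition could be written for a one-dimensional Gaussian-process surrogate. It decomposes the negative expected free energy into a pragmatic term (prior preference for high objective values) and an epistemic term (expected information gain). All function names, the kernel choice and the linear log-preference parameterisation are assumptions for this sketch, not part of the project.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between row vectors in A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Exact GP posterior mean and variance at test points Xs,
    # given observations (X, y), via a Cholesky factorisation.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 1e-12)

def neg_expected_free_energy(mu, var, noise=1e-4, preference_temp=1.0):
    # Negative EFE = pragmatic value + epistemic value.
    # Pragmatic: a (hypothetical) linear log-preference for high
    # objective values.  Epistemic: expected information gain from
    # observing the (noisy) objective at this location.
    pragmatic = mu / preference_temp
    epistemic = 0.5 * np.log(1.0 + var / noise)
    return pragmatic + epistemic

# Toy run: maximise f(x) = -(x - 0.7)^2 on [0, 1] from three evaluations.
f = lambda x: -(x - 0.7)**2
X = np.array([[0.0], [0.5], [1.0]])
y = f(X).ravel()
Xs = np.linspace(0.0, 1.0, 201)[:, None]          # candidate locations
mu, var = gp_posterior(X, y, Xs)
x_next = Xs[np.argmax(neg_expected_free_energy(mu, var))]
```

Selecting the candidate that maximises this quantity trades off sampling where the objective is believed to be high against sampling where uncertainty about the objective remains large, which is the exploration-exploitation resolution the abstract refers to.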