Flexible transfer learning framework for Bayesian optimisation


Author(s): Joy, Tinu Theckel; Rana, Santu; Gupta, Sunil Kumar; Venkatesh, Svetha
Contributor(s)

Bailey, James

Khan, Latifur

Washio, Takashi

Dobbie, Gillian

Huang, Joshua Zhexue

Wang, Ruili

Date(s)

01/01/2016

Abstract

Bayesian optimisation is an efficient technique for optimising functions that are expensive to evaluate. In this paper, we propose a novel framework that transfers knowledge from a completed source optimisation task to a new target task in order to overcome the cold-start problem. We model source data as noisy observations of the target function, and compute the noise level from the data in a Bayesian setting. This enables flexible knowledge transfer across tasks of differing relatedness, addressing a limitation of existing methods. We evaluate the framework on tuning the hyperparameters of two machine learning algorithms. Treating a fraction of the training data as the source task and the full dataset as the target task, we show that our method finds the best hyperparameters in the least time compared to both the state-of-the-art and a no-transfer baseline.
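The core idea in the abstract can be illustrated with a minimal sketch: source observations are pooled with target observations but given an extra noise variance, and that variance is selected by marginal likelihood. This is a simplified stand-in for the paper's full Bayesian treatment; the function names (`target_f`, `source_f`), the grid of candidate noise levels, and the use of scikit-learn's `GaussianProcessRegressor` with a per-sample `alpha` are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical 1-D target function (expensive in practice) and a related
# source function observed during an earlier, completed optimisation task.
def target_f(x):
    return -(x - 0.6) ** 2

def source_f(x):
    return -(x - 0.5) ** 2  # related to, but not identical with, the target

rng = np.random.default_rng(0)
X_src = rng.uniform(0.0, 1.0, 20).reshape(-1, 1)
y_src = source_f(X_src).ravel()
X_tgt = np.array([[0.1], [0.9]])  # cold start: only two target evaluations
y_tgt = target_f(X_tgt).ravel()

X_all = np.vstack([X_src, X_tgt])
y_all = np.concatenate([y_src, y_tgt])

# Treat source points as noisy observations of the target function; pick the
# source-noise variance by maximising the GP log marginal likelihood over a
# grid (a crude stand-in for the paper's Bayesian noise estimate).
best_lml, best_gp, best_noise = -np.inf, None, None
for noise in (1e-6, 1e-4, 1e-2, 1e-1, 1.0):
    alpha = np.concatenate([np.full(len(X_src), noise),
                            np.full(len(X_tgt), 1e-6)])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                  alpha=alpha,
                                  normalize_y=True).fit(X_all, y_all)
    lml = gp.log_marginal_likelihood()
    if lml > best_lml:
        best_lml, best_gp, best_noise = lml, gp, noise

# A GP-UCB acquisition over a candidate grid selects the next target point.
cand = np.linspace(0.0, 1.0, 201).reshape(-1, 1)
mu, sd = best_gp.predict(cand, return_std=True)
x_next = float(cand[np.argmax(mu + 2.0 * sd)][0])
print(f"estimated source noise: {best_noise}, next evaluation at x = {x_next:.2f}")
```

When the source task is closely related, a small noise level maximises the marginal likelihood and the source data strongly shapes the posterior; when it is unrelated, a large noise level is selected and the transfer is effectively switched off, which is the flexibility the abstract describes.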

Identifier

http://hdl.handle.net/10536/DRO/DU:30083252

Language(s)

eng

Publisher

Springer

Relation

http://dro.deakin.edu.au/eserv/DU:30083252/gupta-flexibletransfer-evid-2016.pdf

http://dro.deakin.edu.au/eserv/DU:30083252/joy-flexibletransfer-2016.pdf

http://www.dx.doi.org/10.1007/978-3-319-31753-3_9

Rights

2016, Springer

Type

Book Chapter