Sampling-based optimization with mixtures

R. Bardenet, Balázs Kégl
TAO - Machine Learning and Optimisation, LRI - Laboratoire de Recherche en Informatique, UP11 - Université Paris-Sud 11, Inria Saclay - Île-de-France, CNRS UMR 8623
Abstract: Sampling-based Evolutionary Algorithms (EAs) are of great use when dealing with highly non-convex and/or noisy optimization tasks, which are the kind of tasks we often have to solve in Machine Learning. Two derivative-free examples of such methods are Estimation of Distribution Algorithms (EDAs) and techniques based on the Cross-Entropy Method (CEM). One of the main problems these algorithms have to solve is finding a good surrogate model for the normalized target function, that is, a model complex enough to fit the target function, yet simple enough to keep the computations tractable. Gaussian mixture models have been applied in practice with great success, but most of these approaches have lacked a solid theoretical foundation. In this paper we give a sound mathematical justification for Gaussian mixture surrogate models; more precisely, we propose a proper derivation of an EDA/CEM algorithm with mixture updates based on Expectation-Maximization (EM) techniques. The resulting algorithm resembles recent Population MCMC schemes, thus reinforcing the link between Monte Carlo integration methods and sampling-based optimization. Throughout this paper we concentrate on continuous optimization.
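The abstract describes an EDA/CEM-style loop in which a Gaussian mixture surrogate is repeatedly refit to the best ("elite") samples by EM. The paper's actual update rules are not reproduced here; the following is a minimal, generic sketch of that idea in one dimension. The objective `f`, the initial mixture, and all parameter values (population size, elite fraction, number of EM iterations) are illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(0)

def f(x):
    # Hypothetical multimodal objective: global minimum 0 at x = 2,
    # a worse local minimum of value 0.5 at x = -2.
    return min((x - 2.0) ** 2, 0.5 + (x + 2.0) ** 2)

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_fit(points, mus, sigmas, weights, iters=10):
    """A few EM iterations for a 1-D Gaussian mixture, fit to the elite points."""
    for _ in range(iters):
        # E-step: responsibilities of each component for each point.
        resp = []
        for x in points:
            r = [w * gauss_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)]
            z = sum(r) or 1e-300
            resp.append([ri / z for ri in r])
        # M-step: reweighted means, variances, and mixing proportions.
        for k in range(len(mus)):
            nk = sum(r[k] for r in resp)
            if nk < 1e-12:
                continue  # dying component: leave its parameters untouched
            mus[k] = sum(r[k] * x for r, x in zip(resp, points)) / nk
            var = sum(r[k] * (x - mus[k]) ** 2 for r, x in zip(resp, points)) / nk
            sigmas[k] = max(math.sqrt(var), 1e-3)  # floor avoids collapse
            weights[k] = nk / len(points)
        s = sum(weights)
        weights = [w / s for w in weights]
    return mus, sigmas, weights

def sample_mixture(mus, sigmas, weights):
    u, acc = random.random(), 0.0
    for m, s, w in zip(mus, sigmas, weights):
        acc += w
        if u <= acc:
            return random.gauss(m, s)
    return random.gauss(mus[-1], sigmas[-1])

# CEM-style loop: sample from the mixture, keep the elite fraction,
# refit the mixture surrogate to the elites by EM.
mus, sigmas, weights = [-5.0, 5.0], [3.0, 3.0], [0.5, 0.5]
for _ in range(30):
    pop = [sample_mixture(mus, sigmas, weights) for _ in range(200)]
    pop.sort(key=f)
    elite = pop[:40]  # top 20%
    mus, sigmas, weights = em_fit(elite, mus, sigmas, weights)

best = min(pop, key=f)
```

The mixture is what lets this sketch keep probability mass on several basins at once, whereas a single-Gaussian CEM update would have to commit to one mode early.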


http://hal.in2p3.fr/in2p3-00580580
Contributor: Sabine Starita
Submitted on: Monday, March 28, 2011

File

OPT2009-Bardenet.pdf (produced by the author(s))

Identifiers

  • HAL Id: in2p3-00580580, version 1

Citation

R. Bardenet, Balázs Kégl. Sampling-based optimization with mixtures. OPT 2009: 2nd NIPS Workshop on Optimization for Machine Learning, Dec 2009, Whistler, Canada. ⟨in2p3-00580580⟩
