Preprint / Working paper. Year: 2024

A divergence-based condition to ensure quantile improvement in black-box global optimization

Abstract

Black-box global optimization aims at minimizing an objective function whose analytical form is not known. To do so, many state-of-the-art methods rely on sampling-based strategies, in which sampling distributions are built iteratively so that their mass concentrates where the objective function is low. Despite empirical success, the theoretical study of these methods remains difficult. In this work, we introduce a new framework, based on divergence-decrease conditions, to study and design black-box global optimization algorithms. Our approach allows us to establish and quantify the improvement of proposals at each iteration, in terms of the expected value or a quantile of the objective. We show that the information-geometric optimization approach fits within our framework, yielding a new approach for its analysis. We also establish proposal-improvement results for two novel algorithms: one related to the cross-entropy method with mixture models, and another using heavy-tailed sampling proposal distributions.
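To make the family of methods described above concrete, here is a minimal sketch of the classical cross-entropy method with a single Gaussian proposal: samples are drawn from the current proposal, the best (elite) quantile is kept, and the proposal is refit to those elites so that its mass concentrates where the objective is low. This is an illustration of the general scheme only, not the paper's mixture-model or heavy-tailed algorithms; the function name cross_entropy_minimize and all parameter values are assumptions made for the example.

    # Illustrative sketch of the cross-entropy method (not the paper's algorithm).
    import numpy as np

    def cross_entropy_minimize(f, dim, n_samples=200, elite_frac=0.1, n_iters=50, seed=0):
        rng = np.random.default_rng(seed)
        mean, std = np.zeros(dim), 5.0 * np.ones(dim)   # broad initial proposal
        n_elite = max(1, int(elite_frac * n_samples))
        for _ in range(n_iters):
            x = rng.normal(mean, std, size=(n_samples, dim))   # sample the proposal
            vals = np.apply_along_axis(f, 1, x)                # evaluate the black box
            elite = x[np.argsort(vals)[:n_elite]]              # keep the best quantile
            # Refit the proposal to the elite samples so its mass shifts
            # toward regions where f is low.
            mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-8
        return mean

    # Example: minimize a shifted sphere function; the returned mean
    # should approach the optimum at (3, 3).
    sphere = lambda x: np.sum((x - 3.0) ** 2)
    print(cross_entropy_minimize(sphere, dim=2))

The elite fraction plays the role of the quantile whose improvement the paper's divergence-decrease conditions are designed to guarantee at each iteration.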
Main file: divergenceBasedCondQuantileImprovement.pdf (365.04 KB). Origin: files produced by the author(s).

Dates and versions

hal-04616771, version 1 (19-06-2024)

Identifiers

  • HAL Id: hal-04616771, version 1

Cite

Thomas Guilmeau, Emilie Chouzenoux, Víctor Elvira. A divergence-based condition to ensure quantile improvement in black-box global optimization. 2024. ⟨hal-04616771⟩