Re-examining linear embeddings for high-dimensional Bayesian optimization

Research output: Contribution to journal › Conference article › Peer-reviewed

Contributors

  • Benjamin Letham, Meta Platforms, Inc. (Author)
  • Roberto Calandra, Meta Platforms, Inc. (Author)
  • Akshara Rai, Meta Platforms, Inc. (Author)
  • Eytan Bakshy, Meta Platforms, Inc. (Author)

Abstract

Bayesian optimization (BO) is a popular approach to optimize expensive-to-evaluate black-box functions. A significant challenge in BO is to scale to high-dimensional parameter spaces while retaining sample efficiency. A solution considered in existing literature is to embed the high-dimensional space in a lower-dimensional manifold, often via a random linear embedding. In this paper, we identify several crucial issues and misconceptions about the use of linear embeddings for BO. We study the properties of linear embeddings from the literature and show that some of the design choices in current approaches adversely impact their performance. We show empirically that properly addressing these issues significantly improves the efficacy of linear embeddings for BO on a range of problems, including learning a gait policy for robot locomotion.
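The core idea described above — optimizing over a low-dimensional subspace mapped into the high-dimensional parameter space via a random linear embedding — can be sketched minimally as follows. All names, dimensions, and the toy objective are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Illustrative ambient and embedding dimensions (not from the paper).
D, d = 100, 4

def f(x):
    # Hypothetical expensive black-box objective on [-1, 1]^D whose
    # value happens to depend on only a few coordinates.
    return (x[0] - 0.5) ** 2 + (x[1] + 0.3) ** 2

rng = np.random.default_rng(0)
B = rng.normal(size=(D, d))  # random linear embedding matrix

def f_embedded(y):
    # Map a low-dimensional point y up to the ambient space, then clip
    # to the box constraints before evaluating the black-box function.
    x = np.clip(B @ y, -1.0, 1.0)
    return f(x)

# A BO loop would now search the d-dimensional space by repeatedly
# evaluating f_embedded instead of f directly.
y0 = rng.normal(size=d) * 0.1
value = f_embedded(y0)
```

The clipping step is one example of the design choices the paper scrutinizes: points in the low-dimensional space can map outside the feasible box, and how that is handled affects performance.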

Details

Original language: English
Number of pages: 13
Journal: Advances in neural information processing systems : ... proceedings of the ... conference
Volume: 33
Publication status: Published - 2020
Peer-reviewed: Yes
Externally published: Yes

Conference

Title: 34th Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS 2020
Conference number: 34
Duration: 6–12 December 2020
Location: Online

External IDs

ORCID: /0000-0001-9430-8433/work/146646287