Re-examining linear embeddings for high-dimensional Bayesian optimization
Research output: Contribution to journal › Conference article › Contributed › peer-review
Abstract
Bayesian optimization (BO) is a popular approach to optimize expensive-to-evaluate black-box functions. A significant challenge in BO is to scale to high-dimensional parameter spaces while retaining sample efficiency. A solution considered in existing literature is to embed the high-dimensional space in a lower-dimensional manifold, often via a random linear embedding. In this paper, we identify several crucial issues and misconceptions about the use of linear embeddings for BO. We study the properties of linear embeddings from the literature and show that some of the design choices in current approaches adversely impact their performance. We show empirically that properly addressing these issues significantly improves the efficacy of linear embeddings for BO on a range of problems, including learning a gait policy for robot locomotion.
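For orientation, the sketch below illustrates the generic random-linear-embedding idea the abstract refers to (a REMBO-style construction from the prior literature), not the specific fixes proposed in the paper. The dimensions `D` and `d`, the box bounds, the clipping convention, and the toy objective `f` are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a random linear embedding for high-dimensional BO:
# optimize over a low-dimensional point z and map it into the original
# D-dimensional box through a fixed random matrix A (illustrative only).

D, d = 100, 4                       # ambient and embedding dimensions (assumed)
rng = np.random.default_rng(0)
A = rng.standard_normal((D, d))     # random linear embedding, A in R^{D x d}

def embed(z, lo=-1.0, hi=1.0):
    """Map a low-dimensional z to the ambient box [lo, hi]^D."""
    x = A @ z
    return np.clip(x, lo, hi)       # clip back onto the box (a common convention)

def f(x):
    """Hypothetical expensive black-box objective, stands in for the real one."""
    return -np.sum(x ** 2)

# A BO loop would propose z inside a small low-dimensional box,
# evaluate f(embed(z)), and fit its surrogate on the (z, value) pairs.
z = rng.uniform(-np.sqrt(d), np.sqrt(d), size=d)
value = f(embed(z))
```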
Details
Original language | English |
---|---|
Journal | Advances in Neural Information Processing Systems |
Volume | 2020-December |
Publication status | Published - 2020 |
Peer-reviewed | Yes |
Conference
Title | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 |
---|---|
Duration | 6 - 12 December 2020 |
City | Virtual, Online |
External IDs
ORCID | /0000-0001-9430-8433/work/146646287 |
---|---|