Abstract
Recent advances in deep generative models focus on generating samples that satisfy multiple desired properties. However, prevalent approaches optimize these property functions independently, thus neglecting the trade-offs among them. In addition, the property optimization is often improperly integrated into the generative model, resulting in an unnecessary compromise on generation quality (i.e., the quality of generated samples). To address these issues, we formulate a constrained optimization problem that maximizes generation quality while ensuring that generated samples reside on the Pareto front of the multiple property objectives. This formulation yields samples that cannot be simultaneously improved on the conflicting property functions while preserving the quality of the generated samples. Building upon this formulation, we introduce the ParetO-gUided Diffusion model (PROUD), in which the gradients in the denoising process are dynamically adjusted to enhance generation quality while the generated samples adhere to Pareto optimality.
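The core mechanism described above, combining the gradients of conflicting property objectives into a single guidance direction during denoising, can be illustrated with a short sketch. This is not the authors' implementation: it uses Désidéri's two-objective MGDA closed form as a stand-in for the paper's Pareto guidance, and the names `score_fn`, `property_fns`, `step`, and `guide` are hypothetical placeholders.

```python
import torch

def min_norm_combination(g1, g2):
    # Minimal sketch (assumption): for two objectives, the min-norm element
    # of the convex hull of {g1, g2} has a closed form (Désidéri's MGDA):
    # pick alpha in [0, 1] minimizing ||alpha*g1 + (1 - alpha)*g2||^2.
    diff = g1 - g2
    alpha = torch.clamp(torch.dot(g2 - g1, g2) / (torch.dot(diff, diff) + 1e-12),
                        0.0, 1.0)
    return alpha * g1 + (1 - alpha) * g2

def pareto_guided_step(x, score_fn, property_fns, step=0.01, guide=0.1):
    # One hypothetical denoising update: follow the diffusion score for
    # generation quality, plus a common-descent direction that does not
    # worsen either property objective (single-sample, two objectives).
    x = x.detach().requires_grad_(True)
    grads = []
    for f in property_fns:
        (g,) = torch.autograd.grad(f(x).sum(), x, retain_graph=True)
        grads.append(g.flatten())
    d = min_norm_combination(grads[0], grads[1]).view_as(x)
    with torch.no_grad():
        return x + step * score_fn(x) - guide * d

# Toy usage with two conflicting quadratic objectives and a dummy score.
x0 = torch.randn(4)
f1 = lambda x: ((x - 1.0) ** 2).sum()
f2 = lambda x: ((x + 1.0) ** 2).sum()
x1 = pareto_guided_step(x0, score_fn=lambda x: -x, property_fns=[f1, f2])
print(x1)
```

When the two property gradients conflict, the min-norm combination points in a direction that decreases both objectives (or vanishes on the Pareto front), which is what lets the quality term dominate once Pareto optimality is reached.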
| Original language | English |
|---|---|
| Pages (from-to) | 6511-6538 |
| Number of pages | 28 |
| Journal | Machine Learning |
| Volume | 113 |
| Issue number | 9 |
| Early online date | 2 Jul 2024 |
| DOIs | |
| Publication status | Published - Sept 2024 |
Bibliographical note
Publisher Copyright: © The Author(s), under exclusive licence to Springer Science+Business Media LLC, part of Springer Nature 2024.
Keywords
- Diffusion model
- Generative model
- Multi-objective generation
- Pareto optimality