Abstract
Recent advances in deep generative models focus on generating samples that satisfy multiple desired properties. However, prevalent approaches optimize these property functions independently, ignoring the trade-offs among them. Moreover, property optimization is often improperly integrated into the generative model, leading to an unnecessary compromise in generation quality (i.e., the quality of generated samples). To address these issues, we formulate a constrained optimization problem that maximizes generation quality while ensuring that generated samples lie on the Pareto front of the multiple property objectives. This formulation yields samples that cannot be simultaneously improved on the conflicting property functions, while preserving the quality of the generated samples. Building on this formulation, we introduce the ParetO-gUided Diffusion model (PROUD), in which the gradients in the denoising process are dynamically adjusted to enhance generation quality while the generated samples adhere to Pareto optimality.
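The Pareto-guidance idea can be illustrated on a two-objective toy problem. The sketch below is an assumption for illustration, not the paper's exact algorithm: it uses the classic two-task min-norm solver (MGDA-style) to combine two conflicting gradients into a single common descent direction, which vanishes exactly at Pareto-stationary points. PROUD applies an analogous gradient adjustment inside the diffusion denoising steps.

```python
import numpy as np

def min_norm_direction(g1, g2):
    """Min-norm convex combination d = w*g1 + (1-w)*g2 of two
    objective gradients (two-task MGDA solver). d is a common
    descent direction; d ~ 0 signals Pareto stationarity."""
    diff = g1 - g2
    denom = diff @ diff
    if denom == 0.0:          # gradients already agree
        return g1.copy()
    # Closed-form minimizer of ||w*g1 + (1-w)*g2|| over w in [0, 1]
    w = np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0)
    return w * g1 + (1.0 - w) * g2

# Toy conflicting "property" objectives:
#   f1(x) = ||x - a||^2,  f2(x) = ||x - b||^2.
# Their Pareto set is the line segment between a and b; gradient
# steps along the combined direction drive x onto that segment.
a = np.array([1.0, 0.0])
b = np.array([-1.0, 0.0])
x = np.array([0.3, 2.0])
for _ in range(200):
    g1 = 2.0 * (x - a)
    g2 = 2.0 * (x - b)
    x = x - 0.05 * min_norm_direction(g1, g2)
```

After the loop, `x` sits on the segment between `a` and `b` (its second coordinate has decayed to near zero), and the combined direction at that point is essentially zero, i.e., no step can improve both objectives at once.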
Original language | English |
---|---|
Pages (from-to) | 6511-6538 |
Number of pages | 28 |
Journal | Machine Learning |
Volume | 113 |
Issue number | 9 |
Early online date | 2 Jul 2024 |
DOIs | |
Publication status | Published - Sept 2024 |
Bibliographical note
Publisher Copyright: © The Author(s), under exclusive licence to Springer Science+Business Media LLC, part of Springer Nature 2024.
Funding
This work was supported by the National Research Foundation, Singapore and DSO National Laboratories under the AI Singapore Programme (AISG Award No: AISG2-GC-2023-010-T), the A*STAR GAP project (Grant No. I23D1AG079), the A*STAR Career Development Fund (Grant No. C222812019), the A*STAR Pitchfest for ECR 232D800027, the A*STAR Centre for Frontier AI Research, the Program for Guangdong Introducing Innovative and Entrepreneurial Teams (Grant No. 2017ZT07X386), NSFC (Grant No. 62250710682), and the Program for Guangdong Provincial Key Laboratory (Grant No. 2020B121201001).
Keywords
- Diffusion model
- Generative model
- Multi-objective generation
- Pareto optimality