## Abstract

In this paper, we analyze the minimization of seminorms ∥L · ∥ under the constraint of a bounded I-divergence D(b, H · ) for rather general linear operators H and L. The I-divergence is also known as Kullback-Leibler divergence and appears in many models in imaging science, in particular when dealing with Poisson data but also in the case of multiplicative Gamma noise. Typically, H represents a linear blur operator and L is some discrete derivative or frame analysis operator. A central part of this paper consists in proving relations between the parameters of the I-divergence constrained and penalized problems. To solve the I-divergence constrained problem, we consider various first-order primal-dual algorithms which reduce the problem to the solution of certain proximal minimization problems in each iteration step. One of these proximal problems is an I-divergence constrained least-squares problem which can be solved by a Newton method based on Morozov's discrepancy principle. We prove that these algorithms produce not only a sequence of vectors which converges to a minimizer of the constrained problem but also a sequence of parameters which converges to a regularization parameter so that the corresponding penalized problem has the same solution. Furthermore, we derive a rule for automatically setting the constraint parameter for data corrupted by multiplicative Gamma noise. The performance of the various algorithms is finally demonstrated for different image restoration tasks, both for images corrupted by Poisson noise and by multiplicative Gamma noise.

| Original language | English |
|---|---|
| Article number | 035007 |
| Number of pages | 28 |
| Journal | Inverse Problems |
| Volume | 29 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Mar 2013 |
| Externally published | Yes |

## Fingerprint

Dive into the research topics of 'Minimization and parameter estimation for seminorm regularization models with *I*-divergence constraints'. Together they form a unique fingerprint.