Color image deblurring with impulsive noise

Leah Bar*, Alexander Brook, Nir Sochen, Nahum Kiryati

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We propose a variational approach for deblurring and impulsive noise removal in multi-channel images. A robust data fidelity measure and edge-preserving regularization are employed. We consider several regularization approaches, such as Beltrami flow, Mumford-Shah and Total-Variation Mumford-Shah. The latter two methods are extended to multi-channel images and reformulated using the Γ-convergence approximation. Our main contribution is the unification of image deblurring and impulse noise removal in a multi-channel variational framework. Theoretical and experimental results show that the Mumford-Shah and Total-Variation Mumford-Shah regularization methods are superior to other regularizers for color image restoration. In addition, these two methods yield a denoised edge map of the image.
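For orientation only, a schematic sketch of the kind of functional described above (not the authors' exact formulation; the weights and parameters below are generic placeholders): a robust data fidelity term combined with the Ambrosio-Tortorelli Γ-convergence approximation of the Mumford-Shah regularizer, for a multi-channel image u = (u_1, ..., u_m), blur kernel h, observed image g and a shared edge-indicator function v, can be written as

\[
\min_{u,\,v}\;
\int_\Omega \sqrt{\textstyle\sum_{c=1}^{m} (h * u_c - g_c)^2 + \eta}\; dx
\;+\;
\beta \int_\Omega v^2 \sum_{c=1}^{m} |\nabla u_c|^2 \, dx
\;+\;
\alpha \int_\Omega \Big( \varepsilon\,|\nabla v|^2 + \frac{(1-v)^2}{4\varepsilon} \Big)\, dx .
\]

The first term (with a small η > 0 smoothing the L1 norm) is robust to impulsive noise; the second penalizes image gradients away from edges; the third drives v toward an edge map as ε → 0, which is why Mumford-Shah-type regularizers also yield a denoised edge map of the restored image.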

Original language: English
Title of host publication: Variational, Geometric, and Level Set Methods in Computer Vision - Third International Workshop, VLSM 2005, Proceedings
Publisher: Springer Verlag
Pages: 49-60
Number of pages: 12
ISBN (Print): 3540293485, 9783540293484
DOIs
State: Published - 2005
Event: 3rd International Workshop on Variational, Geometric, and Level Set Methods in Computer Vision, VLSM 2005 - Beijing, China
Duration: 16 Oct 2005 → 16 Oct 2005

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3752 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 3rd International Workshop on Variational, Geometric, and Level Set Methods in Computer Vision, VLSM 2005
Country/Territory: China
City: Beijing
Period: 16/10/05 → 16/10/05
