Analysis of Motion Blur with a Flutter Shutter Camera for Non-Linear Motion


CIS Colloquium, Mar 16, 2011, 11:00AM – 12:00PM, Wachman 447


Jingyi Yu, University of Delaware

Image blurs confound many computer vision problems. The recently proposed coded photography paradigm aims to solve this problem by reversibly encoding information about the scene in a single photograph, so that the corresponding decoding allows powerful decomposition of the image into light fields, motion-deblurred images, differently focused depth layers, and so on. In this talk, I will first present a novel coded photography technique called the flutter shutter (FS). The FS blocks light in time by fluttering the shutter open and closed in a carefully chosen binary sequence. We show that by using the FS, we can preserve high spatial frequencies of fast-moving objects to support high-quality motion deblurring.

To deblur the image, existing FS methods assume known constant-velocity motion, specified, for example, by the user. In this talk, we extend the FS technique to general 1D motions and develop an automatic motion-from-blur framework by analyzing image statistics under the FS. We first introduce a flutter-shutter point spread function (FS-PSF) to uniformly model the blur kernel under general motions, and we show that many commonly used motions have closed-form FS-PSFs. To recover the FS-PSF from the blurred image, we present a new method based on image power spectrum statistics: we show that the modulation transfer function (MTF) of the 1D FS-PSF is statistically correlated with the blurred image's power spectrum along the motion direction, and we then recover the FS-PSF by finding the motion parameters that maximize this correlation.

We demonstrate our techniques on a variety of motions, including constant velocity, constant acceleration, and harmonic rotation. Experimental results show that our method can automatically and accurately recover the motion from blurs captured under the flutter shutter.
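To give a feel for why a fluttered exposure helps deblurring, the sketch below (not the speaker's code) compares the MTF of a conventional continuous exposure, whose blur kernel for constant-velocity motion is a box, against a flutter-shutter blur kernel. The 14-chop binary sequence here is a made-up illustration, not the sequence used in the talk; in practice the code is chosen so that its MTF stays well away from zero.

```python
import numpy as np

# Hypothetical binary chop sequence for illustration only (1 = shutter
# open, 0 = shutter closed); real flutter-shutter codes are optimized.
code = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1], dtype=float)
box = np.ones_like(code)  # continuous exposure of the same duration

# Normalize both 1D blur kernels to the same total exposure.
fs_psf = code / code.sum()
box_psf = box / box.sum()

# The MTF is the magnitude of the Fourier transform of the PSF.
n = 256
mtf_fs = np.abs(np.fft.rfft(fs_psf, n))
mtf_box = np.abs(np.fft.rfft(box_psf, n))

# A box blur's MTF has near-zero dips (those spatial frequencies are
# essentially destroyed), which makes deconvolution ill-posed; a
# well-chosen flutter code keeps the MTF bounded away from zero so
# the blur remains invertible.
print("smallest MTF value, box exposure:     %.4f" % mtf_box.min())
print("smallest MTF value, flutter exposure: %.4f" % mtf_fs.min())
```

Because the blurred image's power spectrum is (statistically) the scene's power spectrum multiplied by the squared MTF, correlating candidate FS-PSF MTFs against the observed spectrum along the motion direction is what lets the motion parameters be estimated automatically.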

Jingyi Yu is an Associate Professor in the Department of Computer and Information Sciences at the University of Delaware (UD). He joined the faculty at UD in 2005. Before that, he received his BS with honors from Caltech in 2000 and his Ph.D. in EECS from MIT in 2005 under the supervision of Dr. Leonard McMillan. His research spans a range of areas, with particular interests in computer vision, computational photography, computer graphics, and biomedical and bioinformatics applications. He has received a number of grant awards, including an NSF CAREER Award in 2009 and an Air Force Young Investigator Award in 2010.