Motion blur for motion segmentation
Published in IEEE International Conference on Image Processing, 2013
Paramanand Chandramouli, Ambasamudram Narayanan Rajagopalan
Abstract
In this paper, we develop a method for motion segmentation using blur kernels. A blur kernel represents the apparent motion undergone by a scene point in the image plane. When the relative motion between the camera and the scene is not restricted to fronto-parallel translations, the shape of the blur kernels can vary across image points. For a dynamic scene, we model motion blur using transformation spread functions (TSFs), which represent the relative motions. Given a set of blur kernels estimated at different points across an image, we develop a method to segment them according to their relative motion. We initially group the blur kernels based on their 'compatibility'. We then refine this initial segmentation by jointly estimating the TSF and removing the outliers.
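As a rough illustration of the blur model described above (not the authors' code): if the TSF is discretised as a weighted set of in-plane rotations and translations, the blurred image is a weighted average of correspondingly warped copies of the sharp image, and the blur kernel at a pixel is the weighted set of displacements those transformations induce at that pixel. The sketch below assumes this simplified parameterisation; the helper names (apply_tsf, local_kernel) and the example TSF weights are hypothetical.

```python
import numpy as np
from scipy import ndimage


def transform_point(p, centre, angle_deg, t):
    """Forward map of a pixel p under an in-plane rotation about `centre` plus translation t."""
    a = np.deg2rad(angle_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return R @ (p - centre) + centre + t


def apply_tsf(image, tsf):
    """Blur `image` with a TSF given as a list of (weight, angle_deg, trans_row, trans_col)."""
    h, w = image.shape
    centre = np.array([(h - 1) / 2.0, (w - 1) / 2.0])
    total = sum(entry[0] for entry in tsf)
    blurred = np.zeros(image.shape, dtype=float)
    for weight, angle, tr, tc in tsf:
        a = np.deg2rad(angle)
        R = np.array([[np.cos(a), -np.sin(a)],
                      [np.sin(a),  np.cos(a)]])
        t = np.array([tr, tc])
        # ndimage.affine_transform computes g(q) = f(M @ q + offset); choosing
        # M = R^T pushes the sharp image forward by the rotation + translation.
        offset = centre - R.T @ (centre + t)
        blurred += (weight / total) * ndimage.affine_transform(
            image.astype(float), R.T, offset=offset, order=1, mode='nearest')
    return blurred


def local_kernel(point, image_shape, tsf, size=21):
    """Rasterise the blur kernel at `point` (row, col): each TSF entry deposits its
    weight at the displacement it induces on that pixel, so the kernel shape
    changes across the image unless the motion is a pure translation."""
    h, w = image_shape
    centre = np.array([(h - 1) / 2.0, (w - 1) / 2.0])
    total = sum(entry[0] for entry in tsf)
    kernel = np.zeros((size, size))
    p = np.asarray(point, dtype=float)
    for weight, angle, tr, tc in tsf:
        q = transform_point(p, centre, angle, np.array([tr, tc]))
        dr, dc = np.round(q - p).astype(int)
        r, c = size // 2 + dr, size // 2 + dc
        if 0 <= r < size and 0 <= c < size:
            kernel[r, c] += weight / total
    return kernel


# Hypothetical TSF: a small in-plane rotation blended with a slight translation.
tsf = [(1.0, 0.0, 0.0, 0.0), (1.0, 1.0, 0.5, 0.5), (1.0, 2.0, 1.0, 1.0)]
img = np.random.rand(128, 128)
blurred = apply_tsf(img, tsf)
print(local_kernel((10, 10), img.shape, tsf))   # kernel near an image corner
print(local_kernel((64, 64), img.shape, tsf))   # kernel near the centre: a different shape
```

Evaluating local_kernel at different pixels shows the kernels elongating differently away from the rotation centre, which mirrors the point in the abstract that kernel shapes vary across image points when the motion is not a fronto-parallel translation.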
Resources
Bibtex
@inproceedings{paramanand2013motion,
  title={Motion blur for motion segmentation},
  author={Paramanand, Chandramouli and Rajagopalan, A. N.},
  booktitle={2013 IEEE International Conference on Image Processing},
  pages={4244--4248},
  year={2013},
  organization={IEEE}
}