A shape preserving approach for salient object detection using convolutional neural networks

Jongpil Kim, Vladimir Pavlovic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations

Abstract

Determining visual saliency is one of the fundamental problems in computer vision, as saliency not only identifies the most informative parts of a visual scene but may also reduce computational complexity by filtering out irrelevant segments of the scene. In this paper, we propose a novel salient object detection method that combines shape-preserving saliency prediction, driven by a convolutional neural network, with mid- and low-level region-preserving image information. Our model learns a saliency shape dictionary, which is subsequently used to train a CNN to predict the salient class of a target region and estimate the full but coarse saliency map of the target image. The map is then refined using image-specific low-to-mid level information. Performance evaluation on popular benchmark datasets shows that the proposed method outperforms existing state-of-the-art methods in saliency detection.
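To make the pipeline described in the abstract concrete, the following is a minimal, hypothetical Python sketch of its three stages: learning a saliency shape dictionary from training masks, combining dictionary atoms weighted by CNN-predicted shape-class scores into a coarse map, and refining that map with an image-specific low-level cue. The clustering-based dictionary, the function names, and the multiplicative refinement are assumptions for illustration, not the authors' exact formulation.

```python
# Hypothetical sketch of the shape-preserving saliency pipeline; the actual
# CNN architecture, dictionary construction, and refinement differ in the paper.
import numpy as np
from sklearn.cluster import KMeans


def learn_shape_dictionary(masks, n_atoms=64):
    """Cluster binary ground-truth saliency masks (each H x W) into a
    dictionary of prototypical saliency shapes (assumed k-means)."""
    H, W = masks[0].shape
    X = np.stack([m.reshape(-1) for m in masks]).astype(np.float32)
    km = KMeans(n_clusters=n_atoms, n_init=10, random_state=0).fit(X)
    return km.cluster_centers_.reshape(n_atoms, H, W)  # atoms in [0, 1]


def coarse_saliency(cnn_scores, dictionary):
    """Combine dictionary atoms weighted by CNN-predicted scores over the
    n_atoms shape classes to form a full but coarse saliency map."""
    w = cnn_scores / (cnn_scores.sum() + 1e-8)
    return np.tensordot(w, dictionary, axes=1)  # H x W coarse map


def refine(coarse_map, low_level_prior):
    """Refine the coarse map with an image-specific low/mid-level cue
    (e.g., region contrast), both assumed normalized to [0, 1]."""
    refined = coarse_map * low_level_prior
    return (refined - refined.min()) / (refined.max() - refined.min() + 1e-8)
```

In this sketch the CNN is treated as a black box that outputs a score per shape class; any per-region classifier producing such scores could be plugged in.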

Original language: English (US)
Title of host publication: 2016 23rd International Conference on Pattern Recognition, ICPR 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 609-614
Number of pages: 6
ISBN (Electronic): 9781509048472
DOIs
State: Published - Jan 1 2016
Event: 23rd International Conference on Pattern Recognition, ICPR 2016 - Cancun, Mexico
Duration: Dec 4 2016 - Dec 8 2016

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 0

Other

Other: 23rd International Conference on Pattern Recognition, ICPR 2016
Country/Territory: Mexico
City: Cancun
Period: 12/4/16 - 12/8/16

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
