Publication | Closed Access
Saliency Optimization from Robust Background Detection
Citations: 1.4K
References: 26
Year: 2014
Venue: Unknown
Keywords: Scene Analysis, Machine Vision, Image Analysis, Engineering, Scene Interpretation, Pattern Recognition, Object Detection, Video Processing, Medical Image Computing, Saliency Optimization, Scene Understanding, Boundary Prior, Salient Object Detection, Boundary Connectivity, Deep Learning, Vision Recognition, Computer Vision
Recent progress in salient object detection has exploited the boundary prior, or background information, to assist other saliency cues such as contrast, achieving state-of-the-art results. However, existing usage of the boundary prior is simple and fragile, and its integration with other cues is mostly heuristic. In this work, we present new methods to address these issues. First, we propose a robust background measure, called boundary connectivity. It characterizes the spatial layout of image regions with respect to image boundaries and is much more robust than previous boundary priors. It has an intuitive geometrical interpretation and presents unique benefits that are absent in previous saliency measures. Second, we propose a principled optimization framework that integrates multiple low-level cues, including our background measure, to obtain clean and uniform saliency maps. Our formulation is intuitive and efficient, and it achieves state-of-the-art results on several benchmark datasets.
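The boundary connectivity measure mentioned in the abstract can be illustrated with a minimal sketch. Assuming the definition BndCon(R) = Len_bnd(R) / sqrt(Area(R)) — the length of a region's perimeter lying on the image border, normalized by the square root of its area — the snippet below computes it per region from a hard label map. (The function name and the hard-segmentation setting are illustrative assumptions; the published method operates on superpixels with soft, geodesic-distance-based areas.)

```python
import numpy as np

def boundary_connectivity(labels):
    """Sketch of per-region boundary connectivity:
    BndCon(R) = Len_bnd(R) / sqrt(Area(R)),
    where Len_bnd counts the region's pixels on the image border and
    Area is its total pixel count. A region fully enclosed in the
    image interior (a likely salient object) scores 0; a region that
    hugs the border (likely background) scores high."""
    labels = np.asarray(labels)
    # Mask of pixels lying on the image border.
    border = np.zeros(labels.shape, dtype=bool)
    border[0, :] = border[-1, :] = True
    border[:, 0] = border[:, -1] = True
    scores = {}
    for r in np.unique(labels):
        mask = labels == r
        area = mask.sum()                 # Area(R)
        len_bnd = (mask & border).sum()   # Len_bnd(R)
        scores[int(r)] = len_bnd / np.sqrt(area)
    return scores
```

For example, on a 4x4 image whose interior 2x2 block is region 1 and whose border ring is region 0, region 1 gets boundary connectivity 0 (it never touches the border), while region 0 scores 12 / sqrt(12).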