Smart Object Extraction from a Picture and Blending into Another

Daniel Lischinski, HUJI, School of Computer Science and Engineering


An efficient new method for recovering reliable local sets of dense correspondences between two images with shared content.

The method can be applied to:

  • automatically adjusting the source image's tonal characteristics to match a reference,
  • transferring a known mask to a new image,
  • kernel estimation for image deblurring
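To illustrate the first application, the sketch below applies a simple global parametric color transform (a per-channel gain and gamma curve, used here as an illustrative stand-in for the method's actual non-linear model) to adjust a source image's tones. The function name and parameterization are assumptions for the example, not the patented method.

```python
import numpy as np

def apply_color_model(image, gains, gammas):
    """Apply a toy global parametric color transform to an image with
    values in [0, 1]: out_c = gain_c * in_c ** gamma_c, per channel.
    (Illustrative stand-in for a fitted non-linear color model.)
    """
    out = np.empty_like(image)
    for c in range(image.shape[-1]):
        out[..., c] = np.clip(gains[c] * image[..., c] ** gammas[c], 0.0, 1.0)
    return out
```

With unit gains and gammas the transform is the identity; fitted parameters would instead map the source's tones toward the reference's.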


The approach simultaneously recovers both a robust set of dense correspondences between sufficiently similar regions in two images and a global non-linear parametric color transformation model. A new coarse-to-fine scheme is utilized in which nearest-neighbor field computations using Generalized PatchMatch [Barnes et al. 2010] are interleaved with fitting a global non-linear parametric color model and aggregating consistent matching regions using locally adaptive constraints.
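The interleaving idea above can be sketched in miniature: alternate between fitting a global color model to the current matches and pruning matches inconsistent with it. The sketch below uses a per-channel affine model and a fixed residual threshold as simplifying assumptions (the actual method fits a non-linear model and uses locally adaptive constraints over nearest-neighbor fields); all names and thresholds here are illustrative.

```python
import numpy as np

def fit_color_model_robust(src_colors, ref_colors, iters=5, thresh=0.1):
    """Alternate between fitting a global per-channel affine color model
    and pruning matches inconsistent with it -- a toy version of
    interleaving model fitting with match aggregation.

    src_colors, ref_colors: (N, 3) arrays of matched pixel colors in [0, 1].
    Returns per-channel gains a, offsets b, and the final inlier mask.
    """
    inliers = np.ones(len(src_colors), dtype=bool)
    a, b = np.ones(3), np.zeros(3)
    for _ in range(iters):
        # Fit the model to the current inlier matches (least squares).
        for c in range(3):
            x, y = src_colors[inliers, c], ref_colors[inliers, c]
            A = np.stack([x, np.ones_like(x)], axis=1)
            (a[c], b[c]), *_ = np.linalg.lstsq(A, y, rcond=None)
        # Keep only matches the fitted model explains well.
        resid = np.abs(src_colors * a + b - ref_colors).max(axis=1)
        inliers = resid < thresh
    return a, b, inliers
```

On synthetic matches contaminated with outliers, the alternation recovers the underlying color transform while discarding the inconsistent pairs, which is the robustness property the method relies on at each scale of its coarse-to-fine scheme.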

Below is an example of color transfer.


The reference image (a) was taken indoors using a flash, while the source image (b) was taken outdoors, against a completely different background, and under natural illumination. The correspondence algorithm detects parts of the woman’s face and dress as shared content (c), and fits a parametric color transfer model (d). The appearance of the woman in the result (e) matches the reference (a).


  • New correspondence method that combines dense local matching with robustness to outliers.
  • Identification of correspondences between non-rigid objects with significant variance in their appearance characteristics, including dramatically different pose, lighting, viewpoint and sharpness.
  • Combines the advantages of both worlds: dense correspondences, as in optical flow and stereo reconstruction methods, with robustness to geometric and photometric variations, as in sparse feature matching.


The method for computing a reliable dense set of correspondences between two images is specifically designed to handle a scenario where the input images share some common content, but may differ significantly due to a variety of factors, such as non-rigid changes in the scene, changes in lighting and/or tone mapping, and different cameras and lenses. This scenario often arises in personal photo albums, which typically contain repeating subjects photographed under different conditions.

Patent Status

Granted US 9,014,470

Contact for more information:

Anna Pellivert