The invention relates to a scene flow estimation method based on 3D local rigidity and depth-map-guided anisotropic smoothing. The method comprises the following steps: S1, acquiring, through an RGB-D sensor, a texture image and a depth image that are aligned and captured at the same time; S2, building a scene flow estimation energy functional and calculating a dense scene flow based on a 3D locally rigid surface hypothesis and a globally constrained method, wherein the form of the scene flow energy functional is shown in the description; S3, designing data terms based on the texture image and the depth image as well as the 3D locally rigid surface hypothesis; S4, designing smoothness terms based on a depth-map-driven anisotropic diffusion tensor and total variation regularization; S5, creating an image pyramid and adopting a coarse-to-fine solution strategy; and S6, calculating the scene flow by use of a duality method and introducing scene flow auxiliary variables. Illustrative sketches corresponding to steps S2 to S6 are given below.
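The abstract refers to the exact energy functional only through the description, which is not reproduced here. As a reading aid for steps S2 and S3, the following is a minimal, hedged sketch of a typical RGB-D scene flow functional, assuming an intensity constancy data term, a depth constancy data term that accounts for the motion-induced depth change w, and a weighted smoothness term; the symbols I_0, I_1, Z_0, Z_1, the flow components (u, v, w) and the weights mu and lambda are illustrative and are not taken from the patent.

```latex
% Hypothetical form of the step-S2 functional (not the patent's exact expression):
\[
\begin{aligned}
E(\mathbf{u}) ={}& \int_{\Omega} \Big(
      \big| I_{1}\!\big(\mathbf{x}+(u,v)\big) - I_{0}(\mathbf{x}) \big|          % intensity constancy
+ \mu \big| Z_{1}\!\big(\mathbf{x}+(u,v)\big) - Z_{0}(\mathbf{x}) - w \big|      % depth constancy with depth change w
      \Big)\, d\mathbf{x} \\
 &+ \lambda \int_{\Omega} E_{\mathrm{smooth}}\big(\nabla \mathbf{u}\big)\, d\mathbf{x},
 \qquad \mathbf{u} = (u, v, w).
\end{aligned}
\]
```

Under the 3D locally rigid surface hypothesis of step S3, the flow within each local surface patch would additionally be constrained to follow a single rigid-body motion.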
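For the smoothness terms of step S4, one standard way to build a depth-map-driven anisotropic diffusion tensor is the construction sketched below, which reduces smoothing across depth discontinuities while allowing it along them; this is a generic sketch rather than the patent's definition, and the parameters alpha and gamma are illustrative. D is taken as the identity where the depth gradient vanishes.

```latex
% Hypothetical depth-driven anisotropic diffusion tensor and TV-style regularizer (step S4):
\[
D(\nabla Z) = \exp\!\big(-\alpha \lVert \nabla Z \rVert^{\gamma}\big)\, \mathbf{n}\,\mathbf{n}^{\top}
            + \mathbf{n}^{\perp}\,(\mathbf{n}^{\perp})^{\top},
\qquad
\mathbf{n} = \frac{\nabla Z}{\lVert \nabla Z \rVert},
\]
\[
E_{\mathrm{smooth}}(\nabla \mathbf{u})
   = \sum_{c \in \{u, v, w\}} \big\lVert D(\nabla Z)^{1/2}\, \nabla c \big\rVert .
\]
```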
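Step S5 names an image pyramid with a coarse-to-fine solution strategy. The Python sketch below shows one conventional way to organize such a scheme using OpenCV pyramids; the per-level solver solve_level is a hypothetical placeholder for the remaining steps and is supplied by the caller, not defined by the patent.

```python
import cv2
import numpy as np

def build_pyramid(img, levels):
    """Gaussian image pyramid, finest level first (step S5)."""
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def coarse_to_fine_scene_flow(I0, I1, Z0, Z1, solve_level, levels=4):
    """Coarse-to-fine driver; solve_level(i0, i1, z0, z1, flow) is a
    caller-supplied, hypothetical per-level scene flow solver."""
    pyrs = [build_pyramid(x, levels) for x in (I0, I1, Z0, Z1)]
    flow = None
    for lvl in reversed(range(levels)):              # coarsest level first
        i0, i1, z0, z1 = (p[lvl] for p in pyrs)
        h, w = i0.shape[:2]
        if flow is None:
            flow = np.zeros((h, w, 3), np.float32)   # (u, v, w) initialized to zero
        else:
            # upsample the coarser estimate and rescale the pixel displacements
            flow = cv2.resize(flow, (w, h), interpolation=cv2.INTER_LINEAR)
            flow[..., :2] *= 2.0                     # u, v double per level; depth change w is metric
        flow = solve_level(i0, i1, z0, z1, flow)     # refine at the current resolution
    return flow
```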
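Step S6 solves for the scene flow with a duality method and scene flow auxiliary variables. A common realization of this idea (as in TV-L1 optical flow estimation) decouples the data and smoothness terms through an auxiliary variable and a quadratic coupling term; the splitting below is a sketch of that generic scheme, with theta an illustrative coupling parameter, and is not necessarily the patent's exact formulation.

```latex
% Hypothetical quadratic splitting for step S6: u is the regularized scene flow,
% v the auxiliary scene flow variable, theta the coupling parameter.
\[
\min_{\mathbf{u},\,\mathbf{v}} \;
      \int_{\Omega} E_{\mathrm{data}}(\mathbf{v}) \, d\mathbf{x}
 \;+\; \frac{1}{2\theta} \int_{\Omega} \lVert \mathbf{u} - \mathbf{v} \rVert^{2} \, d\mathbf{x}
 \;+\; \lambda \int_{\Omega} E_{\mathrm{smooth}}(\nabla \mathbf{u}) \, d\mathbf{x}.
\]
```

In such a scheme, the minimization alternates between a point-wise step in the auxiliary variable v for the linearized data term and a regularization step in u that can be solved with a dual projection algorithm.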
According to the invention, the weight of the spatial-domain filter is determined jointly by the color difference and the positional relationship between pixels of the color image, so that the edge distortion problem in the repair process is resolved. To reduce the repair error, the weight of the range-domain filter is determined by the color information together with the structural similarity coefficient.
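The two preceding sentences specify how the filter weights are formed but not the filter itself. The Python sketch below shows one plausible reading, in which missing depth values are repaired by a jointly weighted neighborhood filter; the Gaussian kernel forms, the parameters sigma_s, sigma_c, sigma_r and the per-pixel structural-similarity map ssim_map are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def repair_depth_pixel(depth, color, ssim_map, y, x, radius=5,
                       sigma_s=3.0, sigma_c=10.0, sigma_r=8.0):
    """Fill depth[y, x] from valid neighbors using jointly weighted filtering.

    Spatial-domain weight: pixel position AND color difference (illustrative).
    Range-domain weight:   color information AND a structural-similarity score.
    """
    h, w = depth.shape
    num, den = 0.0, 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            if not (0 <= ny < h and 0 <= nx < w) or depth[ny, nx] <= 0:
                continue  # skip out-of-bounds and invalid (hole) depth samples
            dist2 = dy * dy + dx * dx
            cdiff = np.linalg.norm(color[ny, nx].astype(float) - color[y, x].astype(float))
            # spatial-domain weight: position and color difference together
            w_s = np.exp(-dist2 / (2 * sigma_s ** 2)) * np.exp(-cdiff ** 2 / (2 * sigma_c ** 2))
            # range-domain weight: color information modulated by structural similarity
            w_r = np.exp(-cdiff ** 2 / (2 * sigma_r ** 2)) * ssim_map[ny, nx]
            weight = w_s * w_r
            num += weight * depth[ny, nx]
            den += weight
    return num / den if den > 0 else depth[y, x]
```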