
Color video matting method based on depth foreground tracking

A color video and color image technology, applied in the field of color video matting based on depth foreground tracking. It addresses the problems that existing methods' processing time cannot meet real-time requirements, that the manual interaction workload is large, and that local edge artifacts occur, while ensuring spatial and temporal continuity, keeping computational cost low, and simplifying operation.

Active Publication Date: 2017-12-15

AI Technical Summary

Problems solved by technology

The disadvantages of existing video matting strategies are: some key frames in the video sequence must be annotated, so the manual interaction workload is heavy; the processing time cannot meet real-time requirements, so the algorithms are typically applied as post-processing; and, owing to the propagation strategy, artifacts and jumps appear at local edges.




Embodiment Construction

[0018] The present invention will be described in further detail below in conjunction with the accompanying drawings and embodiments.

[0019] A color video matting method based on depth foreground tracking, as shown in Fig. 1. Before matting starts, the user specifies the approximate area of the foreground on the depth image to define the search range and improve matting accuracy. Then, within the search box, the foreground object is segmented using the depth difference between the foreground and the background. The depth image is reconstructed to the resolution of the color image and registered so that the two correspond pixel by pixel, generating a trimap. According to the color information of the color image, the trimap is finely adjusted so that pixels with a value of 1 correspond to the foreground of the color image, pixels with a value of 0 correspond to the background of...
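To make the trimap-generation step concrete, the following is a minimal sketch of segmenting the foreground by a depth interval inside the user-given search box, reconstructing the mask to the color resolution, and deriving a trimap. The function name, the parameter names, the fixed depth interval, and the use of OpenCV resizing and morphology are illustrative assumptions, not details taken from the patent.

import numpy as np
import cv2

def depth_trimap(depth, color_shape, search_box, fg_depth_range, band_px=10):
    # Hypothetical sketch of the depth-driven trimap step.
    #   depth          -- depth frame (H_d x W_d), typically lower resolution than color
    #   color_shape    -- (H_c, W_c) of the registered color frame
    #   search_box     -- (x, y, w, h) region on the depth image chosen by the user
    #   fg_depth_range -- (near, far) depth interval assumed to contain the foreground
    #   band_px        -- width in pixels of the unknown band around the boundary
    x, y, w, h = search_box
    near, far = fg_depth_range

    # 1. Segment the foreground inside the search box by depth difference.
    mask = np.zeros(depth.shape, dtype=np.uint8)
    roi = depth[y:y + h, x:x + w]
    mask[y:y + h, x:x + w] = ((roi >= near) & (roi <= far)).astype(np.uint8)

    # 2. Reconstruct (upsample) the mask to the color resolution; a real system
    #    would also apply the depth-to-color camera registration here.
    mask_hi = cv2.resize(mask, (color_shape[1], color_shape[0]),
                         interpolation=cv2.INTER_NEAREST)

    # 3. Erode for definite foreground, dilate for the outer limit of the
    #    unknown band; everything else is background.
    kernel = np.ones((band_px, band_px), np.uint8)
    sure_fg = cv2.erode(mask_hi, kernel)
    maybe_fg = cv2.dilate(mask_hi, kernel)

    trimap = np.zeros(color_shape, dtype=np.float32)   # 0   = background
    trimap[maybe_fg > 0] = 0.5                         # 0.5 = unknown band
    trimap[sure_fg > 0] = 1.0                          # 1   = definite foreground
    return trimap

The subsequent color-based refinement and superpixel-gradient foreground extraction described in the abstract would operate on this trimap together with the registered color frame.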



Abstract

The invention proposes a color video matting method based on depth foreground tracking, relating to the field of digital image processing. The method comprises: before matting starts, the user designates the foreground area on a depth image to define a clear search range; then, within the search box, the foreground object is segmented using the depth difference of the foreground; the depth image is reconstructed to the resolution of the color image and registered so that the two correspond pixel by pixel, and a trimap is generated; according to the color information of the color image, the trimap is finely adjusted; based on the trimap and the color image, the foreground is rapidly extracted using superpixel gradients; finally, according to the extracted foreground position, a motion equation is established and updated, the foreground center of the next frame is predicted, and the search box position is updated. With the above steps, the video matting is completed frame by frame. Compared with current video matting algorithms, the invention greatly simplifies the operation, has low algorithmic complexity, and realizes real-time and accurate video foreground matting.
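As an illustration of the final tracking steps in the abstract (establishing and updating the motion equation, predicting the next-frame foreground center, and updating the search box), here is a minimal constant-velocity sketch. The class name, the state layout, and the velocity-smoothing factor are assumptions made for illustration; the patent text shown here does not fix these details.

import numpy as np

class ForegroundTracker:
    # Hypothetical constant-velocity model for the foreground center.
    def __init__(self, init_center, box_size, alpha=0.5):
        self.center = np.asarray(init_center, dtype=float)  # (cx, cy)
        self.velocity = np.zeros(2)                          # (vx, vy) per frame
        self.box_size = box_size                             # (w, h) of the search box
        self.alpha = alpha                                   # velocity smoothing factor

    def update(self, measured_center):
        # Call once per frame with the center of the extracted foreground.
        measured = np.asarray(measured_center, dtype=float)
        new_velocity = measured - self.center
        # Exponentially smooth the velocity estimate to suppress jitter.
        self.velocity = self.alpha * new_velocity + (1 - self.alpha) * self.velocity
        self.center = measured

    def predict_search_box(self):
        # Predict the next-frame center and return the re-centered box (x, y, w, h).
        predicted = self.center + self.velocity
        w, h = self.box_size
        return (int(predicted[0] - w / 2), int(predicted[1] - h / 2), w, h)

Typical per-frame use would be: extract the foreground inside the current box, call update() with its center, then call predict_search_box() to position the box for the next frame.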

Description

Technical field
[0001] The invention relates to the field of digital image processing, in particular to a color video matting method based on depth foreground tracking.
Background technique
[0002] Video matting is an extension of digital image matting to video: in each frame, the foreground, the background, and the transparency are computed, the foreground object is cut out of the video background, and it can then be composited with any background image to create a realistic scene-change effect.
[0003] At present, real-time video matting for live sessions usually requires a pure green screen as the background, against which the person is cut out and composited onto other backgrounds. Video matting against complex backgrounds, however, is still at the laboratory research stage and requires key frames or 3D space-time labeling methods; the interaction is complicated and cannot be completed in real time, so it can only be applied as post-processing on video files. In the key...
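For reference, the compositing model that underlies the matting described in [0002] expresses each observed pixel as a blend of foreground and background weighted by the transparency (alpha): with I the observed color, F the foreground color, B the background color, and α ∈ [0, 1] the transparency at pixel p,

    I(p) = α(p) · F(p) + (1 − α(p)) · B(p)

Matting estimates F and α per pixel; the extracted foreground can then be composited onto any new background by re-applying the same equation.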


Application Information

IPC (8): G06T7/215, G06T7/207, G06T7/194
CPC: G06T7/194, G06T7/207, G06T7/215, G06T2207/10016
Inventor: 王灿进, 孙涛, 王挺峰, 王锐, 陈飞, 田玉珍
Owner: 长春长光启衡传感技术有限公司