
Interactive time-space accordant video matting method in digital video processing

A digital video and interactive technology, applied to digital video signal modification, image data processing, television and related fields, which solves the problems of discontinuous results and long processing times caused by frame-by-frame video processing.

Inactive Publication Date: 2008-01-02
ZHEJIANG UNIV


Problems solved by technology

[0011] The purpose of the present invention is to overcome the disadvantages of discontinuous results and long processing time caused by frame-by-frame processing of video in the prior art, and to provide an interactive, fast video matting method with temporal-spatial consistency.



Embodiment Construction

[0043] The flow chart of the interactive, spatiotemporally consistent fast video matting method in digital video processing of the present invention is shown in Fig. 1; it comprises the following steps:

[0044] The first step is interactive volume diffusion: the interaction is carried out on the 3D video volume, as shown in Figure 2. The 3D video volume is composed of the frames of the video stacked in time sequence, and has an x-axis, a y-axis, and a time axis. A minimal code sketch of the volume construction and of the stroke marking described in the sub-steps below is given after those sub-steps.

[0045] 1) The user interacts on the 3D video volume. The user can rotate, slice, and segment the video volume arbitrarily, and can mark the foreground area, background area, and unknown area on the video volume with strokes of different colors. Figure 2(1) shows a slice of the video volume, and Figure 2(2) shows marking on that slice; in the figure, F is the foreground area, B is the background area, and U is the unknown area.

[0046] 2) Record the set of points in the undetermined ...
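The patent text contains no source code; the following is a minimal sketch, assuming NumPy and OpenCV, of the structures described above: the 3D video volume of paragraph [0044] assembled by stacking frames along the time axis, and a label volume in which the foreground (F), background (B), and unknown (U) strokes of paragraph [0045] are recorded. All names, the label encoding, and the example stroke are illustrative assumptions, not the inventors' implementation.

```python
import numpy as np
import cv2  # assumed available for frame decoding (opencv-python)

# Label codes for the three stroke regions (our own encoding, not the patent's)
BACKGROUND, FOREGROUND, UNKNOWN = 0, 1, 2

def build_video_volume(path):
    """Stack every frame of the video in time order, giving the 3D video
    volume of [0044]: axes (time, y, x) plus a color channel."""
    cap = cv2.VideoCapture(path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    cap.release()
    return np.stack(frames, axis=0)          # shape (T, H, W, 3)

def mark_slice(labels, t, stroke_mask, region):
    """Record one set of user strokes on time slice t of the label volume;
    stroke_mask is a boolean (H, W) array of the stroked pixels."""
    labels[t][stroke_mask] = region
    return labels

# Usage sketch: start with every voxel unknown, then overwrite with strokes.
# The file name and the circular stroke are purely illustrative.
volume = build_video_volume("input.mp4")
labels = np.full(volume.shape[:3], UNKNOWN, dtype=np.uint8)
yy, xx = np.mgrid[:volume.shape[1], :volume.shape[2]]
labels = mark_slice(labels, t=0,
                    stroke_mask=(yy - 50) ** 2 + (xx - 50) ** 2 < 100,
                    region=FOREGROUND)
```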


Abstract

The invention discloses a time-space consistent video matting method for digital video processing, which comprises: (1) interactive diffusion, in which the user interacts on a three-dimensional video volume and a mask is obtained by a diffusion computation, dividing the video volume into a foreground area, a background area, and an unknown area; (2) iterative video matting, in which the alpha value, i.e. the opacity, of each point of the three-dimensional video volume is computed; (3) time-space consistent foreground reconstruction, in which foreground color values that are continuous in both the time and space domains are computed from the alpha values. The method can extract the video foreground quickly, effectively, and with high quality, solving the problems of prior video matting methods such as discontinuous results and long processing time, and has high practicality.
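For orientation, the three stages summarized in this abstract can be wired together as in the sketch below. The stage bodies are deliberately trivial placeholders, because the abstract does not give the concrete diffusion, matting, or reconstruction formulas; only the ordering and the data passed between stages follow the text (NumPy assumed, label codes as in the earlier sketch).

```python
import numpy as np

BACKGROUND, FOREGROUND, UNKNOWN = 0, 1, 2   # illustrative label codes

def diffuse_labels(volume, user_labels):
    """Stage (1) stand-in: a real implementation diffuses the sparse user
    strokes through the whole volume; here the labels are passed on as-is."""
    return user_labels.copy()

def estimate_alpha(volume, trimap):
    """Stage (2) stand-in: alpha = 1 on foreground, 0 on background, and a
    neutral 0.5 on the unknown region where the real solver would work."""
    alpha = np.where(trimap == FOREGROUND, 1.0, 0.0)
    alpha[trimap == UNKNOWN] = 0.5
    return alpha

def reconstruct_foreground(volume, alpha):
    """Stage (3) stand-in: weight the observed colors by alpha; the patented
    step additionally enforces continuity in time and space."""
    return volume.astype(np.float64) * alpha[..., None]

def video_matting_pipeline(volume, user_labels):
    """The three stages of the abstract, executed in order."""
    trimap = diffuse_labels(volume, user_labels)        # (1) interactive diffusion
    alpha = estimate_alpha(volume, trimap)              # (2) iterative video matting
    foreground = reconstruct_foreground(volume, alpha)  # (3) foreground reconstruction
    return alpha, foreground
```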

Description

Technical Field

[0001] The invention relates to an interactive time-space consistent video matting method in digital video processing.

Background Technique

[0002] Matting is a digital image processing technology that separates the foreground part of an image from its background. It is widely used in the production of film and television special effects. According to the object being processed, matting can generally be divided into two categories, namely image matting and video matting. The image matting problem can be defined as follows: given any picture, find the foreground color F, the background color B, and the alpha value α contained in the color value I of each point of the image. The alpha value refers to the opacity of the pixel, and the relationship between them is I = αF + (1-α)B. The set of alpha values corresponding to the original image, that is, the alpha values of all points, is called the alpha matte. On video, this problem can be naturally exten...
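The compositing relation I = αF + (1-α)B quoted above can be checked with a short NumPy example. The alpha-recovery function below is the standard least-squares inversion of that relation over the color channels when F and B are both known; it is included only to illustrate the equation, not as the patent's solver, which also has to estimate F and B.

```python
import numpy as np

def composite(alpha, F, B):
    """Compose observed colors from the matting equation
    I = alpha * F + (1 - alpha) * B, applied per pixel."""
    return alpha[..., None] * F + (1.0 - alpha[..., None]) * B

def recover_alpha(I, F, B, eps=1e-6):
    """Invert the same equation when F and B are known:
    alpha = ((I - B) . (F - B)) / |F - B|^2, per pixel."""
    num = ((I - B) * (F - B)).sum(axis=-1)
    den = ((F - B) ** 2).sum(axis=-1) + eps
    return np.clip(num / den, 0.0, 1.0)

# Tiny worked example on a single pixel:
F = np.array([[1.0, 0.0, 0.0]])    # pure red foreground
B = np.array([[0.0, 0.0, 1.0]])    # pure blue background
alpha = np.array([0.25])
I = composite(alpha, F, B)          # -> [[0.25, 0.0, 0.75]]
print(recover_alpha(I, F, B))       # -> approximately [0.25]
```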


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N7/26; G06T15/20; H04N19/85
Inventor: 夏佳志, 丁子昂, 管宇, 陈为, 彭群生
Owner: ZHEJIANG UNIV