
Video data real-time processing method and device based on adaptive tracking frame segmentation

A technology combining adaptive tracking with video data processing, applied in the field of image processing, addressing problems such as poor segmentation effect, failure to consider the proportion of the foreground image, and low processing efficiency and segmentation accuracy.

Active Publication Date: 2020-07-28
BEIJING QIHOO TECH CO LTD
Cites: 8 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, when the existing image segmentation method performs scene segmentation, it must process the entire content of each frame image, so the amount of data to be processed is large and the processing efficiency is low. In addition, the existing image segmentation method does not consider the proportion of the foreground image within the frame image; when that proportion is small, pixels that actually belong to the edge of the foreground image are easily classified as background, so the resulting segmentation has low accuracy and a poor segmentation effect.
Therefore, the image segmentation method in the prior art suffers from a large amount of data processing for scene segmentation, low processing efficiency and low segmentation accuracy, so the obtained segmentation results cannot be used to add personalized special effects to the frame images of the video accurately, and the resulting processed video data has a poor display effect.
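
As a rough, hedged illustration of the data-volume point above (the frame and crop sizes below are assumptions chosen for illustration, not figures from the patent), segmenting only the area inside a tracking frame instead of the whole frame sharply reduces the number of pixels the segmentation step has to handle:

```python
# Hypothetical sizes: a Full-HD frame versus an assumed tracking-frame crop.
full_frame_pixels = 1920 * 1080   # whole-frame segmentation (prior art)
crop_pixels = 640 * 720           # partial-area segmentation inside the tracking frame

print(f"crop covers {crop_pixels / full_frame_pixels:.1%} of the frame")  # ~22.2%
```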




Embodiment Construction

[0120] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present disclosure and to fully convey the scope of the present disclosure to those skilled in the art.

[0121] The present invention provides a real-time video data processing method based on adaptive tracking frame segmentation, which takes into account that, during video shooting or recording, the number of specific objects being photographed or recorded may change due to movement and other reasons. Taking the specific object to be a human body as an example, the number of photographed or recorded human bodies may increase or...
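
Paragraph [0121] is truncated here, so the following is only a hedged sketch of how a tracking frame might be adapted from frame to frame as the number of tracked objects changes, consistent with the "adjust the tracking frame corresponding to the (t-1)-th frame image according to the t-th frame image" step in the abstract. The detector (OpenCV's default HOG people detector) and the margin value are stand-ins chosen to keep the sketch runnable, not details taken from the patent.

```python
import cv2

# Stand-in detector for the "specific object" (human bodies here); the patent
# does not disclose a detector, so OpenCV's default HOG people detector is used
# purely to keep this sketch self-contained.
_hog = cv2.HOGDescriptor()
_hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_objects(frame):
    rects, _ = _hog.detectMultiScale(frame, winStride=(8, 8))
    return [tuple(int(v) for v in r) for r in rects]   # (x, y, w, h) boxes

def adjust_tracking_frame(prev_tracking_frame, frame, margin=16):
    """Adapt the (t-1)-th frame's tracking frame to the t-th frame.

    The union of all currently detected object boxes (plus an assumed margin)
    becomes the new tracking frame, so it grows or shrinks as objects enter or
    leave the shot; if nothing is detected, the previous tracking frame is kept.
    """
    boxes = detect_objects(frame)
    if not boxes:
        return prev_tracking_frame
    x0 = min(x for x, y, w, h in boxes) - margin
    y0 = min(y for x, y, w, h in boxes) - margin
    x1 = max(x + w for x, y, w, h in boxes) + margin
    y1 = max(y + h for x, y, w, h in boxes) + margin
    H, W = frame.shape[:2]
    x0, y0 = max(0, x0), max(0, y0)
    x1, y1 = min(W, x1), min(H, y1)
    return (x0, y0, x1 - x0, y1 - y0)
```

Taking the union of the detected boxes lets the tracking frame expand when a second person enters the shot and contract when someone leaves, which is the adaptive behaviour paragraph [0121] alludes to.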



Abstract

The invention discloses a video data real-time processing method and device based on adaptive tracking frame segmentation, a computing device and a computer storage medium. The method comprises the steps of: obtaining the t-th frame image containing a specific object in a group of frame images, together with the tracking frame corresponding to the (t-1)-th frame image; adjusting the tracking frame corresponding to the (t-1)-th frame image according to the t-th frame image, thereby obtaining the tracking frame corresponding to the t-th frame image; carrying out scene segmentation on a partial area of the t-th frame image according to that tracking frame, thereby obtaining a segmentation result corresponding to the t-th frame image; determining a second foreground image of the t-th frame image according to the segmentation result; adding a personalized special effect according to the second foreground image, thereby obtaining a processed t-th frame image; covering the original t-th frame image with the processed t-th frame image, thereby obtaining processed video data; and displaying the processed video data. According to this technical scheme, personalized special effects can be added to frame images relatively precisely and rapidly.
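
Putting the abstract's steps together, one per-frame iteration could look roughly like the Python sketch below. It reuses an `adjust_tracking_frame` helper like the one sketched after paragraph [0121] above, and `segment_region` / `apply_effect` are crude stand-ins for the patent's scene-segmentation network and personalized special effect; all of these names and choices are assumptions, not implementations disclosed in the patent.

```python
import cv2

def segment_region(crop):
    # Stand-in for the scene-segmentation network: a crude saturation threshold,
    # used only so the sketch runs end to end; the patent's model is not given here.
    hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)
    _, mask = cv2.threshold(hsv[:, :, 1], 40, 255, cv2.THRESH_BINARY)
    return mask  # uint8 mask, 255 = foreground

def apply_effect(foreground):
    # Stand-in "personalized special effect": a soft glow on the foreground.
    blurred = cv2.GaussianBlur(foreground, (0, 0), 3)
    return cv2.addWeighted(foreground, 0.7, blurred, 0.5, 0)

def process_frame(frame, prev_tracking_frame):
    """One iteration of the pipeline described in the abstract."""
    x, y, w, h = adjust_tracking_frame(prev_tracking_frame, frame)   # adjust tracking frame
    crop = frame[y:y + h, x:x + w]                                   # partial area only
    mask = segment_region(crop)                                      # scene segmentation
    foreground = cv2.bitwise_and(crop, crop, mask=mask)              # second foreground image
    styled = apply_effect(foreground)                                # personalized effect
    out = frame.copy()
    region = out[y:y + h, x:x + w]
    region[mask > 0] = styled[mask > 0]                              # cover the original frame
    return out, (x, y, w, h)
```

In a live stream, the tracking frame returned for the t-th frame would be fed back in as `prev_tracking_frame` for the (t+1)-th frame, matching the (t-1) → t recursion described in the abstract.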

Description

Technical Field

[0001] The present invention relates to the technical field of image processing, and in particular to a video data real-time processing method, device, computing device and computer storage medium based on adaptive tracking frame segmentation.

Background Technique

[0002] In the prior art, when a user needs to perform personalized processing on a video, such as changing the background or adding special effects, an image segmentation method is often used to carry out scene segmentation on the frame images of the video; image segmentation methods based on deep learning can achieve pixel-level segmentation. However, the existing image segmentation method must process the entire content of each frame image when performing scene segmentation, so the amount of data to be processed is large and the processing efficiency is low; in addition, the existing image...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04N21/44, H04N21/431, H04N21/2187, G06T7/11, G06T7/194, G06T7/246
CPC: G06T7/11, G06T7/194, G06T7/248, G06T2207/10016, G06T2207/10024, G06T2207/20021, H04N21/2187, H04N21/4312, H04N21/44, H04N21/44012
Inventor: 赵鑫, 邱学侃, 颜水成
Owner: BEIJING QIHOO TECH CO LTD