
Video understanding method

A video understanding technology involving video frames, applied in the field of video understanding, which can solve the problem that dense frame features cannot be extracted.

Inactive Publication Date: 2018-11-30
NAT COMP NETWORK & INFORMATION SECURITY MANAGEMENT CENT +1

AI Technical Summary

Problems solved by technology

However, this feature extraction method is often unable to extract dense frame features.




Embodiment Construction

[0043] Preferred embodiments of the present invention are described below with reference to the accompanying drawings. Those skilled in the art should understand that these embodiments are only used to explain the technical principles of the present invention, and are not intended to limit the protection scope of the present invention.

[0044] Referring to Figure 1, which exemplarily shows the main steps of a video understanding method in this embodiment. As shown in Figure 1, the video understanding method in this embodiment may include the following steps:

[0045] Step S101: Obtain multiple video frame groups of the target video.

[0046] The video frame group in this embodiment may include two ordered video frame sequences, and each ordered video frame sequence may include a plurality of video frames arranged sequentially in chronological order.

[0047] Specifically, in this embodiment, the video frame group of the target video can be obtained according to the following steps:

[0...
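The grouping scheme described above (groups of two ordered frame sequences, each in chronological order) can be sketched as follows. The patent excerpt does not show the exact sampling strategy, so the stride-based segment placement, sequence length, and group count below are illustrative assumptions:

```python
def build_frame_groups(num_frames, seq_len=3, num_groups=4):
    """Pair ordered frame-index sequences into video frame groups.

    Each group holds two ordered sequences; each sequence lists
    `seq_len` frame indices in chronological order. The stride-based
    placement of groups along the video is an assumption for
    illustration, not the patent's exact sampling rule.
    """
    groups = []
    stride = max(1, (num_frames - 2 * seq_len) // max(1, num_groups))
    for g in range(num_groups):
        start = g * stride  # offset each group to a different temporal position
        first = [start + i for i in range(seq_len)]
        second = [start + seq_len + i for i in range(seq_len)]
        if second[-1] < num_frames:  # keep only groups that fit in the video
            groups.append((first, second))
    return groups

# Example: a 32-frame video, four groups of two 3-frame ordered sequences.
for first, second in build_frame_groups(32):
    print(first, second)
```

Each returned pair preserves temporal order both within a sequence and between the two sequences of a group, which is what a temporal relation feature later exploits.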


Abstract

The invention relates to the technical field of computer vision, and in particular to a video understanding method, aimed at solving the technical problem of how to effectively extract the dense frame features and long-term spatiotemporal features of a video. To this end, the video understanding method provided by the invention includes the following steps: first, a plurality of video frame groups of a target video are obtained and processed by a residual network; then a temporal relation network is used to generate the temporal relation feature of the video according to the video frame groups; and finally, the video behavior category of the target video is predicted according to the temporal relation feature. Each video frame group comprises two ordered video frame sequences, and each ordered video frame sequence comprises a plurality of video frames arranged sequentially in chronological order. Based on the above steps, the dense frame features and long-term spatiotemporal features of the target video can be effectively obtained, so that the video behavior category of the target video can be predicted quickly and accurately.
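The three-stage pipeline in the abstract (per-frame features, temporal relation feature, behavior classification) can be sketched in miniature. Everything here is a stand-in: `frame_feature` replaces the residual network with simple frame statistics, the fusion in `temporal_relation_feature` is an assumed averaging of ordered frame features, and the class weights are hypothetical:

```python
def frame_feature(frame):
    """Stand-in for residual-network features: basic statistics of a
    frame's pixel intensities (the real system would use a CNN)."""
    return [sum(frame) / len(frame), max(frame), min(frame)]

def temporal_relation_feature(group):
    """Fuse the two ordered sequences of a frame group into one feature.
    Averaging the ordered per-frame features is an assumption; the
    patent's exact temporal relation network is not shown here."""
    first, second = group
    feats = [frame_feature(f) for f in list(first) + list(second)]
    dim = len(feats[0])
    return [sum(f[d] for f in feats) / len(feats) for d in range(dim)]

def predict_category(groups, class_weights):
    """Average the temporal relation features over all groups, score each
    behavior class linearly, and return the highest-scoring class."""
    rel = [temporal_relation_feature(g) for g in groups]
    dim = len(rel[0])
    video_feat = [sum(r[d] for r in rel) / len(rel) for d in range(dim)]
    scores = {c: sum(w * x for w, x in zip(ws, video_feat))
              for c, ws in class_weights.items()}
    return max(scores, key=scores.get)

# Toy frames: each frame is a flat list of pixel intensities.
frames = [[float(i + j) for j in range(4)] for i in range(1, 13)]
groups = [
    ((frames[0], frames[1], frames[2]), (frames[3], frames[4], frames[5])),
    ((frames[6], frames[7], frames[8]), (frames[9], frames[10], frames[11])),
]
# Hypothetical weights for two behavior categories.
class_weights = {"static": [-1.0, -1.0, -1.0], "motion": [1.0, 1.0, 1.0]}
print(predict_category(groups, class_weights))  # prints "motion"
```

The point of the sketch is the data flow, not the models: frame groups in, one temporal relation feature per group, one pooled video feature, one predicted category out.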

Description

Technical field

[0001] The invention relates to the technical field of computer vision, and in particular to a video understanding method.

Background technique

[0002] Compared with static images, videos contain one-dimensional temporal information, so they can carry more motion information. Based on this motion information, actions that may occur within a future period of time can be predicted. In the field of computer vision, video understanding mainly follows these steps: feature extraction, classification model learning, and behavior classification. Among these, the "feature extraction" step mainly includes feature extraction based on hand-crafted design and feature extraction based on machine learning.

[0003] "Feature extraction based on hand-crafted design" mainly consists of sampling local video sub-blocks and computing statistics over each local video sub-block (i.e., local feature extraction). Specifically, the local sp...
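The local feature extraction described in [0003] can be sketched as sliding a small spatio-temporal window over the video and computing one statistic per sub-block. The block size and the choice of statistic (mean intensity) are assumptions for illustration; hand-crafted pipelines typically use richer descriptors such as gradient histograms:

```python
def sample_subblocks(video, t_size=2, h_size=2, w_size=2):
    """Slide a t*h*w window over a video (nested list indexed as
    video[frame][row][col]) and return one statistic per local
    spatio-temporal sub-block (here: the mean intensity)."""
    T, H, W = len(video), len(video[0]), len(video[0][0])
    feats = []
    for t in range(0, T - t_size + 1, t_size):
        for y in range(0, H - h_size + 1, h_size):
            for x in range(0, W - w_size + 1, w_size):
                vals = [video[t + dt][y + dy][x + dx]
                        for dt in range(t_size)
                        for dy in range(h_size)
                        for dx in range(w_size)]
                feats.append(sum(vals) / len(vals))
    return feats

# Toy 4-frame, 4x4 video of constant intensity 1.0 -> 2*2*2 = 8 sub-blocks.
video = [[[1.0] * 4 for _ in range(4)] for _ in range(4)]
print(sample_subblocks(video))
```

Because the window tiles time as well as space, each returned value summarizes a short local motion patch, which is the sense in which such features are "local" and, as the summary notes, not dense.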


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00
CPC: G06V20/41; G06V20/46
Inventors: 李扬曦, 杜翠兰, 佟玲玲, 王晶, 缪亚男, 胡卫明, 王博, 邓智方, 张宏源
Owner NAT COMP NETWORK & INFORMATION SECURITY MANAGEMENT CENT