
Monitoring video description method based on deep neural network

A surveillance video description technology based on deep neural networks, applied in the fields of computer vision and natural language processing. It addresses the problem that existing methods give little consideration to how the attention region is determined, and achieves the effects of freeing up human resources and offering high applicability.

Publication Date: 2019-11-26 (Inactive)
INSPUR ARTIFICIAL INTELLIGENCE RES INST CO LTD SHANDONG CHINA

AI Technical Summary

Problems solved by technology

However, such methods give little consideration to how the attention region should be determined: regardless of the content of the video frame, the input region is a block of fixed size and shape dictated by what the neural network accepts.




Embodiment Construction

[0037] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0038] For a given video data set with descriptions, since a video can be regarded as a series of rapidly played pictures, the analysis of a video can be treated as the analysis of multiple consecutive pictures. For each video, frames are extracted at equal, short time intervals for analysis. This patent proposes a visual attention mechanism based on th...
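The frame-sampling step described above can be illustrated with a minimal sketch. This assumes OpenCV (cv2) is available; the function name sample_frames and the interval_sec parameter are illustrative assumptions, not the patent's specification.

```python
# Minimal sketch of sampling frames at equal short time intervals,
# assuming OpenCV (cv2); names and defaults are illustrative only.
import cv2

def sample_frames(video_path: str, interval_sec: float = 1.0):
    """Extract one frame every `interval_sec` seconds from the video."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back to a common frame rate
    step = max(int(round(fps * interval_sec)), 1)

    frames = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            frames.append(frame)  # BGR image as a NumPy array
        index += 1
    cap.release()
    return frames
```

The sampled frames would then be fed to the visual analysis stage; the exact interval is a tunable choice rather than a value fixed by the patent text.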



Abstract

The invention provides a monitoring video description method based on a deep neural network, belonging to the technical fields of computer vision and natural language processing. The method performs surveillance video description based on deep learning and uses dual attention, comprising a visual attention mechanism and a language attention mechanism. The model consists of several modules that cooperate with one another, so the result is generally stable and the description quality can be further improved.
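A rough illustration of the dual-attention idea in the abstract is sketched below, assuming PyTorch. The class name DualAttention, the dimensions, and the scoring layers are illustrative assumptions, not the patent's exact architecture.

```python
# Sketch of one dual-attention step (visual + language attention),
# assuming PyTorch; this is an illustrative layout, not the patented model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualAttention(nn.Module):
    def __init__(self, feat_dim: int, hidden_dim: int):
        super().__init__()
        # Visual attention scores frame/region features against the decoder state.
        self.vis_score = nn.Linear(feat_dim + hidden_dim, 1)
        # Language attention scores the states of previously generated words.
        self.lang_score = nn.Linear(hidden_dim + hidden_dim, 1)

    def forward(self, visual_feats, word_states, decoder_state):
        # visual_feats: (T, feat_dim), word_states: (L, hidden_dim),
        # decoder_state: (hidden_dim,)
        h_v = decoder_state.expand(visual_feats.size(0), -1)
        alpha = F.softmax(
            self.vis_score(torch.cat([visual_feats, h_v], dim=-1)).squeeze(-1), dim=0)
        visual_ctx = (alpha.unsqueeze(-1) * visual_feats).sum(dim=0)

        h_l = decoder_state.expand(word_states.size(0), -1)
        beta = F.softmax(
            self.lang_score(torch.cat([word_states, h_l], dim=-1)).squeeze(-1), dim=0)
        lang_ctx = (beta.unsqueeze(-1) * word_states).sum(dim=0)
        return visual_ctx, lang_ctx
```

In a full captioning decoder, the two context vectors would typically be combined with the decoder state to predict the next word; how the modules are actually wired together in the patented method is not specified in the visible text.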

Description

Technical field

[0001] The invention relates to computer vision and natural language processing technology, in particular to a monitoring video description method based on a deep neural network.

Background technique

[0002] In today's technologically advanced society, in order to make people's lives more convenient and ensure safety, surveillance covers nearly every corner of our lives: road monitoring helps regulate driving safety, in-store monitoring helps protect property, in-examination-room monitoring helps regulate examinee behavior, and so on. Faced with such a huge amount of video data, it is impossible to browse all the video information, understand the video content and produce a language description by manpower alone. Therefore, how to process and understand this video content has become a problem to be solved. A few years ago, when we tried to make computers understand surveillance video, the computer could only classify the content of the video or ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/32, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/52, G06V10/25, G06N3/044, G06N3/045, G06F18/2411
Inventors: 尹晓雅, 李锐, 于治楼
Owner: INSPUR ARTIFICIAL INTELLIGENCE RES INST CO LTD SHANDONG CHINA