
Perspective-independent behavior recognition method based on deep learning network

This technology concerns a deep-learning network and a recognition method, applied in the field of view-independent behavior recognition. It addresses the limited effectiveness and poor robustness of existing algorithms, and achieves more comprehensive features, a more complete description of behavior, and better robustness.

Active Publication Date: 2020-02-21
Qingdao Shengruida Technology Co., Ltd. (青岛圣瑞达科技有限公司)
Cites: 5 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0006] Behavior-analysis methods based on spatio-temporal feature points and on skeleton information have achieved remarkable results in the traditional single-view or single-person setting. However, they are poorly suited to areas with heavy pedestrian traffic, such as streets, airports, and stations, where a series of complex problems arises, including human-body occlusion, lighting changes, and viewpoint changes. Under these conditions, using either of these two analysis methods alone often fails to meet real-world requirements, and the robustness of the algorithms can be very poor.


Image

Three patent drawings accompany the application, each captioned "Perspective-independent behavior recognition method based on deep learning network".


Embodiment Construction

[0044] The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained from them by a person of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0045] As shown in Figures 1 and 2, the view-independent behavior recognition method based on a deep-learning network of the present invention comprises a training process, in which a classifier is obtained from a training sample set, and a recognition process, in which the classifier is used to identify test samples.

[0046] The training process, shown in Figure 1, includes the following steps:

[0047] S1) Input the video frame images Image 1 to Imag...



Abstract

The present invention proposes a view-independent behavior recognition method based on a deep-learning network, comprising the following steps: input video frame images captured from a given viewpoint and extract low-level features using deep learning; model the features and assemble them in chronological order into a cube model; convert the cube models from all viewpoints into a view-invariant cylindrical feature-space map; and input the maps into a classifier for training, yielding a view-independent video-behavior classifier. The technical solution of the present invention uses a deep-learning network to analyze human behavior from multiple viewpoints, which improves the robustness of the classification model. It is especially well suited to training and learning on big data, where its advantages can be exploited fully.
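The abstract's pipeline (per-view feature extraction, chronological cube model, view-invariant cylindrical map, classifier input) can be sketched roughly as follows. This is only an illustrative reconstruction: the patent does not disclose the network architecture, the exact cylindrical transform, or the classifier, so every function here (`extract_features`, `build_cube`, `to_cylindrical_map`, and the binning scheme) is a placeholder assumption, not the patented method.

```python
# Illustrative sketch of the abstract's pipeline; names and binning
# scheme are assumptions, not the patent's disclosed implementation.
import numpy as np

def extract_features(frames):
    """Stand-in for the deep-learning feature extractor: one feature
    vector per frame (here, a trivial per-channel mean)."""
    return np.stack([f.mean(axis=(0, 1)) for f in frames])

def build_cube(features):
    """Stack per-frame features in chronological order into a
    'cube' of shape (time, feature_dim)."""
    return features  # already time-ordered after extract_features

def to_cylindrical_map(cube, n_angle_bins=8, n_height_bins=4):
    """Accumulate feature energy into (angle, height) bins of a
    cylindrical grid, then marginalise over the angle axis so a
    camera rotation (an angle-bin shift) leaves the result unchanged."""
    t, d = cube.shape
    grid = np.zeros((n_angle_bins, n_height_bins))
    for ti in range(t):
        for di in range(d):
            a = int(n_angle_bins * di / d)   # feature index -> angle bin
            h = int(n_height_bins * ti / t)  # time index -> height bin
            grid[a, h] += abs(cube[ti, di])
    return grid.sum(axis=0)  # view-independent descriptor

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.random((32, 32, 3)) for _ in range(10)]  # toy video
    cube = build_cube(extract_features(frames))
    signature = to_cylindrical_map(cube)
    print(signature.shape)  # compact descriptor fed to the classifier
```

In a faithful implementation, the per-frame features would come from the trained deep network and the cylindrical map would be built from all camera viewpoints before classifier training; the marginalisation step above merely illustrates why a cylindrical parameterisation can cancel viewpoint rotation.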

Description

Technical field

[0001] The invention relates to the technical field of computer vision, and in particular to a view-independent behavior recognition method based on a deep-learning network.

Background technique

[0002] With the rapid development of information technology and the emergence of concepts such as VR, AR, and artificial intelligence, computer vision has entered its best period of development. Video behavior analysis, one of the most important topics in computer vision, has likewise attracted increasing attention from scholars at home and abroad. It plays a large role in fields such as video surveillance, human-computer interaction, medical care, and video retrieval; in the popular self-driving-car domain, for example, video behavior analysis is highly challenging. Due to the complexity and diversity of human motion, compounded by self-occlusion, multiple scales, viewpoint rotation, an...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62; G06N3/04
CPC: G06N3/045; G06F18/24; G06F18/214
Inventors: Wang Chuanxu (王传旭), Hu Guofeng (胡国锋), Liu Jichao (刘继超), Yang Jianbin (杨建滨), Sun Haifeng (孙海峰), Cui Xuehong (崔雪红), Li Hui (李辉), Liu Yun (刘云)
Owner: Qingdao Shengruida Technology Co., Ltd. (青岛圣瑞达科技有限公司)