Adaptive action recognition method based on multi-view and multi-mode characteristics

An action recognition and multi-view technology, applied in the field of computer vision and pattern recognition, which addresses problems such as the inability to fully capture the historical change process of target movement, performance degradation, and unstable recognition performance, with the effect of improving the accuracy, performance, and stability of recognition.

Inactive Publication Date: 2013-12-25
TIANJIN UNIVERSITY OF TECHNOLOGY


Problems solved by technology

[0005] The purpose of the present invention is to solve the problems that the recognition performance of visible-light-based action recognition methods is unstable, that when the illumination changes greatly (for example, at night) their performance drops sharply, and that a single view cannot fully capture th...



Examples


Example Embodiment

[0032] Example 1

[0033] Figure 1 shows the operation flowchart of the adaptive action recognition method based on multi-view and multi-modal features of the present invention. The method comprises the following steps:

[0034] Step 01: Video preprocessing

[0035] Filter and denoise the input depth and RGB image sequences. At the same time, the infrared device of the Kinect can measure the approximate distance between the target and the camera. From this distance value, add 0.5 meters to obtain the large threshold and subtract 1 meter to obtain the small threshold. For example, in this embodiment the distance between the target and the camera is about 2 meters, so the large threshold is 2.5 meters and the small threshold is 1 meter. When the depth value of a pixel is greater than the large threshold or less than the small threshold, the pixel is marked 0; otherwise it is marked 1. In this way the interference of the background with the t...
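The depth-thresholding rule described in this step can be sketched as follows. This is a minimal illustration of the masking logic only; the function and parameter names are illustrative and not taken from the patent, and the filtering/denoising stage is omitted.

```python
import numpy as np

def background_mask(depth_m, target_dist_m, upper_margin=0.5, lower_margin=1.0):
    """Binary foreground mask from a depth frame (values in meters).

    Pixels deeper than (target distance + 0.5 m) or closer than
    (target distance - 1.0 m) are marked 0 (background); all other
    pixels are marked 1 (foreground), as described in step [0035].
    """
    large_thresh = target_dist_m + upper_margin   # e.g. 2.0 + 0.5 = 2.5 m
    small_thresh = target_dist_m - lower_margin   # e.g. 2.0 - 1.0 = 1.0 m
    mask = ((depth_m >= small_thresh) & (depth_m <= large_thresh)).astype(np.uint8)
    return mask

# Toy example: a 2x3 depth frame with the target at roughly 2 m
depth = np.array([[0.8, 2.0, 2.4],
                  [2.6, 1.0, 3.0]])
print(background_mask(depth, target_dist_m=2.0))
# -> [[0 1 1]
#     [0 1 0]]
```

With the embodiment's values (target at about 2 m), the mask keeps only pixels between 1.0 m and 2.5 m, suppressing both the far background and near occluders.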



Abstract

The invention discloses an adaptive action recognition method based on multi-view and multi-modal characteristics. The method comprises the steps of: preprocessing the videos; constructing a multi-view description of the target's motion change process; extracting equal-hierarchical pyramid features; building multi-view depth and RGB models; and selecting a multi-view model, performing inference, and fusing the multi-modal feature results. First, to address difficulties such as illumination variation and shadows that commonly arise when recognizing actions in visible-light images, recognition is carried out on the basis of multi-view and multi-modal features. Second, to overcome the limitation of a single view, a multi-view description of the target's motion change process is proposed, which captures the change process of the target in the depth and RGB image sequences more completely. Third, the equal-hierarchical pyramid features possess both spatial resolving power and the ability to describe detail, and therefore offer good robustness and discriminability. Finally, the multi-modal features are fused adaptively according to the variation of ambient light, which further improves the performance and stability of the action recognition method.
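The abstract's final step, adaptively fusing multi-modal results according to ambient light, could be realized in many ways. One plausible reading, sketched below under assumptions not taken from the patent, is to estimate scene brightness from the RGB frame and shift weight toward the depth model as the light dims. The linear weighting scheme and the `low`/`high` brightness bounds are illustrative choices.

```python
import numpy as np

def fuse_scores(rgb_scores, depth_scores, rgb_gray, low=40.0, high=120.0):
    """Adaptively fuse per-class scores from the RGB and depth models.

    Brightness is the mean of a grayscale frame (0-255). The RGB model's
    weight ramps linearly from 0 at `low` brightness to 1 at `high`;
    the depth model receives the complementary weight. The bounds and
    the linear ramp are assumptions for illustration only.
    """
    brightness = np.mean(rgb_gray)
    w_rgb = float(np.clip((brightness - low) / (high - low), 0.0, 1.0))
    fused = w_rgb * np.asarray(rgb_scores) + (1.0 - w_rgb) * np.asarray(depth_scores)
    return fused, w_rgb

# Dim scene (mean brightness 30 < low bound): depth scores dominate entirely
dark_frame = np.full((4, 4), 30.0)
fused, w = fuse_scores([0.9, 0.1], [0.2, 0.8], dark_frame)
print(w)       # 0.0 -> RGB model fully down-weighted
print(fused)   # [0.2 0.8]
```

The design intent matches the stated motivation: in low light the visible-light cue becomes unreliable, so the fusion automatically leans on the illumination-invariant depth cue.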

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and pattern recognition. It provides an adaptive action recognition method based on multi-view and multi-modal features, addresses the difficulties of using visible-light images for action recognition, and improves the accuracy and robustness of action recognition. It can be used for recognizing the actions of human targets in surveillance video and for the intelligent management of surveillance video.

Background technique

[0002] With the development of computer technology and information technology, the demand for video-based human motion analysis is becoming increasingly urgent. In systems such as intelligent monitoring, home security, intelligent robots, and athlete training assistance, motion analysis plays an increasingly important role. However, most early human action recognition used ordinary RGB image sequences for action analysis, which are susceptible to interference from factors ...

Claims


Application Information

IPC(8): G06K9/00; G06T7/20
Inventor: 高赞, 张桦, 徐光平, 薛彦兵, 申晓霞, 宋健明
Owner TIANJIN UNIVERSITY OF TECHNOLOGY