Traffic environment pedestrian multi-dimensional motion feature visual extraction method

A technology relating to motion features and their extraction, applied to instruments, character and pattern recognition, computer components, etc.; it addresses problems of existing approaches such as complex calculation, large data volume, and the tendency of clustering algorithms to converge to local optima.

Active Publication Date: 2018-11-16
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

The prior art has the following problems: 1. When extracting the motion vector of each pixel, the pixels are not effectively screened, ...



Examples


Example Embodiment

[0069] The present invention will be further described below in conjunction with the drawings and embodiments.

[0070] As shown in Figure 1, a visual extraction method of pedestrian multi-dimensional motion features in a traffic environment includes the following steps:

[0071] Step 1: Build a pedestrian movement database;

[0072] Collect videos of pedestrians in various movement postures and road positions for each shooting direction of the depth camera, where the shooting directions include directly facing the camera (front), front left, front right, side, back, back left, and back right, seven directions in total, and the postures include three types: walking, running, and standing;
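
The excerpt does not say how the collected clips are organized; below is a minimal sketch of a database index for step 1, assuming one folder per shooting direction and posture on disk. The directory layout, file extension, and class names are illustrative assumptions, not the patent's scheme.

```python
# Minimal sketch of a pedestrian movement database index (step 1).
# Assumed layout: <root>/<direction>/<posture>/*.avi  -- hypothetical, not from the patent.
from dataclasses import dataclass
from pathlib import Path

DIRECTIONS = ["front", "front_left", "front_right", "side",
              "back", "back_left", "back_right"]   # 7 shooting directions
POSTURES = ["walking", "running", "standing"]      # 3 motion postures

@dataclass
class Clip:
    path: Path
    direction: str
    posture: str

def build_database(root):
    """Index every depth-camera clip under `root` by shooting direction and posture."""
    clips = []
    for direction in DIRECTIONS:
        for posture in POSTURES:
            for path in sorted(Path(root, direction, posture).glob("*.avi")):
                clips.append(Clip(path, direction, posture))
    return clips
```
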

[0073] Step 2: Extract images from the videos in the pedestrian motion database and preprocess them to obtain the pedestrian detection frame of each image frame, then extract the detection frame images of the same pedestrian across consecutive image frames;
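
The preprocessing and detection method for step 2 is not disclosed in this excerpt; the following is a minimal sketch, assuming OpenCV's stock HOG+SVM pedestrian detector and a simple IoU-based association to keep the detection-frame images of the same pedestrian across consecutive frames. Both the detector and the matching rule are illustrative stand-ins, not the patented procedure.

```python
# Sketch of step 2: detect one pedestrian box per frame and crop the boxes that
# follow the same pedestrian across consecutive frames. The HOG+SVM detector and
# IoU matching are illustrative stand-ins, not the method of the patent.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    ix = max(0, min(ax2, bx2) - max(a[0], b[0]))
    iy = max(0, min(ay2, by2) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def track_pedestrian(video_path, iou_thresh=0.3):
    """Return the cropped detection-frame images of one pedestrian, frame by frame."""
    cap = cv2.VideoCapture(str(video_path))
    prev_box, crops = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(boxes) == 0:
            continue
        # keep the box that best overlaps the previous one (same pedestrian),
        # otherwise start from the largest detection
        box = (max(boxes, key=lambda b: iou(b, prev_box)) if prev_box is not None
               else max(boxes, key=lambda b: b[2] * b[3]))
        if prev_box is not None and iou(box, prev_box) < iou_thresh:
            continue
        x, y, w, h = box
        crops.append(frame[y:y + h, x:x + w])
        prev_box = box
    cap.release()
    return crops
```
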

[0074...



Abstract

The invention discloses a traffic environment pedestrian multi-dimensional motion feature visual extraction method comprising a step 1 of constructing a pedestrian motion database; a step 2 of extracting the pedestrian detection frame images of the same pedestrian in consecutive image frames; a step 3 of extracting the HOG features of the motion energy map of the same pedestrian; a step 4 of constructing a pedestrian motion pose recognition model based on an Elman neural network; a step 5 of determining the pedestrian pose in the current video by using the Elman-based pedestrian motion pose recognition model; a step 6 of calculating the instantaneous speed sequences of the pedestrian in the X-axis and Y-axis directions to obtain the real-time speed of the pedestrian; and a step 7 of obtaining, according to a three-dimensional scene of the intersection environment, the position information of the pedestrian in the image in real time, and obtaining the real-time motion feature of the pedestrian in combination with the pedestrian pose and the real-time speed. The method has high recognition accuracy and good robustness, is convenient to use, and has good prospects for application and promotion.
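
As a rough illustration of steps 3 and 4 of the abstract, the sketch below accumulates a motion energy image from foreground silhouettes, extracts a HOG descriptor from it, and runs the descriptors of successive windows through a single-hidden-layer Elman-style recurrent pass. The libraries (NumPy, scikit-image), window size, layer sizes, and the untrained random weights are assumptions made for illustration, not the patent's parameters.

```python
# Illustration of steps 3-4: motion energy image -> HOG descriptor -> Elman-style
# recurrent classification. All sizes and libraries are assumed, not specified
# by the patent; the network below is untrained (forward pass only).
import numpy as np
from skimage.feature import hog
from skimage.transform import resize

def motion_energy_image(silhouettes):
    """Binary-OR a stack of foreground silhouettes (each H x W, values 0/1)."""
    return np.clip(np.sum(silhouettes, axis=0), 0, 1).astype(np.float32)

def mei_hog(silhouettes, size=(128, 64)):
    """HOG descriptor of the motion energy image, resized to a fixed window."""
    mei = resize(motion_energy_image(silhouettes), size, anti_aliasing=True)
    return hog(mei, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

class ElmanClassifier:
    """One hidden layer with a context (recurrent) copy, softmax output."""
    def __init__(self, n_in, n_hidden, n_classes, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.W_ctx = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.W_out = rng.normal(0.0, 0.1, (n_classes, n_hidden))
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # Elman recurrence: the hidden layer also receives the previous hidden state.
        self.h = np.tanh(self.W_in @ x + self.W_ctx @ self.h)
        logits = self.W_out @ self.h
        e = np.exp(logits - logits.max())
        return e / e.sum()

# Hypothetical usage: classify a sequence of MEI-HOG descriptors into 3 postures.
# feats = [mei_hog(sils) for sils in silhouette_windows]
# net = ElmanClassifier(n_in=len(feats[0]), n_hidden=64, n_classes=3)
# probs = [net.step(f) for f in feats]
```
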

Description

Technical field
[0001] The invention belongs to the field of traffic monitoring, and in particular relates to a method for visually extracting multi-dimensional motion features of pedestrians in a traffic environment.
Background technique
[0002] In recent years, with the rapid development of science and technology, more and more intelligent methods have been applied to transportation, especially in the field of intelligent driving. Traffic safety is an eternal topic, and collisions between vehicles and pedestrians account for a large proportion of collision accidents. Timely detection and posture recognition of pedestrians is the key to current intelligent traffic active protection systems, and accurate recognition depends above all on pedestrian motion feature extraction.
[0003] Pedestrian pose recognition includes global feature methods and local feature methods. Global features mostly adopt the motion history image approach, that is, the frame...
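
For context on the motion history image approach mentioned in the background (prior art, not the invention itself), a minimal NumPy sketch of how such an image is accumulated from frame differences; the decay constant and difference threshold are illustrative values.

```python
# Minimal sketch of a motion history image (MHI), the global-feature technique
# referred to in the background; tau and diff_thresh are illustrative values.
import numpy as np

def update_mhi(mhi, prev_gray, cur_gray, tau=30.0, diff_thresh=25):
    """Set moving pixels to `tau`, decay all other pixels by one time step."""
    motion = np.abs(cur_gray.astype(np.int16) - prev_gray.astype(np.int16)) > diff_thresh
    return np.where(motion, tau, np.maximum(mhi - 1.0, 0.0))
```
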


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V40/20; G06V10/50; G06F18/214
Inventor: 刘辉, 李燕飞, 韩宇阳
Owner: CENT SOUTH UNIV