
A visual extraction method of multi-dimensional motion features of pedestrians in traffic environment

A motion-feature extraction technology applied in the field of traffic monitoring. It addresses problems such as clustering algorithms that are prone to local convergence, the lack of pixel-point screening, and large data volumes, and achieves convenient posture recognition, high recognition accuracy, and good robustness.

Active Publication Date: 2022-02-15
CENT SOUTH UNIV
Cites: 10 | Cited by: 0

AI Technical Summary

Problems solved by technology

The prior patent has the following problems: 1. when extracting the motion vector of each pixel, the pixels are not effectively screened, so the amount of data is large and the computation is complex; 2. the clustering algorithm used in that patent is prone to local convergence.




Embodiment Construction

[0069] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0070] As shown in Figure 1, a method for visual extraction of multi-dimensional motion features of pedestrians in a traffic environment includes the following steps:

[0071] Step 1: Build a pedestrian motion database;

[0072] Collect videos of pedestrians in various motion postures and at various road positions from each shooting direction of the depth camera; the shooting directions comprise three directions, and the postures include walking, running, and standing;

[0073] Step 2: Extract images from the videos in the pedestrian motion database and preprocess them to obtain the pedestrian detection frame in each image frame, then extract the detection-frame images of the same pedestrian across consecutive image frames;
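
The patent does not name the detector used in this preprocessing step. As a minimal sketch under that assumption, the stock OpenCV HOG-plus-linear-SVM people detector can stand in to produce per-frame pedestrian detection boxes; associating boxes of the same pedestrian across consecutive frames (e.g., by box overlap) would still be needed afterwards and is omitted here.

import cv2

# Sketch of Step 2: sample frames from a database video and crop candidate
# pedestrian detection boxes per frame. The detector choice (OpenCV's default
# HOG people detector) and the frame_step value are illustrative assumptions.
def extract_detection_frames(video_path, frame_step=5):
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % frame_step == 0:
            boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
            for (x, y, w, h) in boxes:
                yield idx, frame[y:y + h, x:x + w]   # cropped detection-frame image
        idx += 1
    cap.release()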

[0074] Step 3: Perform grayscale processing on each pedestrian detection frame image, synthesize the motion energy map of the grayscale image corre...
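
The text is truncated here, so the exact synthesis rule is not visible. A common way to build a motion energy map from a grayscale detection-frame sequence, assumed in the sketch below, is to threshold consecutive-frame differences and accumulate the binary masks; the resize resolution and threshold value are illustrative.

import cv2
import numpy as np

# Sketch of Step 3 under the stated assumption: grayscale the cropped
# detection-frame images, difference consecutive frames, threshold, and
# OR-accumulate the masks into a single motion energy map.
def motion_energy_map(crops, size=(64, 128), thresh=30):
    grays = [cv2.cvtColor(cv2.resize(c, size), cv2.COLOR_BGR2GRAY) for c in crops]
    mem = np.zeros((size[1], size[0]), dtype=np.uint8)
    for prev, curr in zip(grays, grays[1:]):
        diff = cv2.absdiff(curr, prev)
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        mem = cv2.bitwise_or(mem, mask)
    return mem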



Abstract

The invention discloses a method for visually extracting multi-dimensional motion features of pedestrians in a traffic environment, comprising: Step 1: build a pedestrian motion database; Step 2: extract the pedestrian detection-frame images of the same pedestrian in consecutive image frames; Step 3: extract the HOG features of the motion energy map of the same pedestrian; Step 4: construct a pedestrian motion posture recognition model based on an Elman neural network; Step 5: use the Elman-based pedestrian motion posture recognition model to judge the pedestrian's posture in the current video; Step 6: calculate the instantaneous speed sequences of the pedestrian in the X-axis and Y-axis directions and obtain the pedestrian's real-time speed; Step 7: according to the three-dimensional scene of the intersection environment, obtain the pedestrian's position in the image in real time and combine it with the pedestrian's posture and real-time speed to obtain the pedestrian's multi-dimensional motion features. This scheme offers high recognition accuracy and good robustness, and is easy to apply, so it has good potential for application and promotion.
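
Steps 4 and 5 classify the HOG features of the motion energy map with an Elman (simple recurrent) neural network. The abstract does not disclose layer sizes or training details, so the NumPy forward pass below is only a structural sketch: the hidden state is fed back as a context input at the next time step, and the feature dimension, hidden width, and three output classes (walking, running, standing) are assumptions taken from the posture list above.

import numpy as np

# Structural sketch of an Elman network forward pass. All dimensions and the
# random weights are illustrative; training (e.g., backpropagation through
# time) is omitted.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 36, 64, 3            # HOG dim -> hidden -> {walk, run, stand}
W_xh = 0.1 * rng.standard_normal((n_hidden, n_in))
W_hh = 0.1 * rng.standard_normal((n_hidden, n_hidden))   # context (recurrent) weights
W_hy = 0.1 * rng.standard_normal((n_out, n_hidden))
b_h, b_y = np.zeros(n_hidden), np.zeros(n_out)

def elman_forward(x_seq):
    """x_seq: (T, n_in) sequence of HOG feature vectors; returns class probabilities."""
    h = np.zeros(n_hidden)                   # context layer starts at zero
    for x in x_seq:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    logits = W_hy @ h + b_y
    p = np.exp(logits - logits.max())        # softmax over posture classes
    return p / p.sum()

Feeding a (T, 36) HOG feature sequence returns a probability over the three postures; taking the argmax would correspond to the posture judgement of Step 5.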

Description

Technical field

[0001] The invention belongs to the field of traffic monitoring, and in particular relates to a method for visually extracting multi-dimensional motion features of pedestrians in a traffic environment.

Background technique

[0002] In recent years, with the rapid development of science and technology, more and more intelligent methods have been applied to transportation, especially in the field of intelligent driving. Traffic safety is an eternal topic, and collisions between vehicles and pedestrians account for a large proportion of collision accidents. Timely detection and posture recognition of pedestrians is the key to current intelligent traffic active protection systems, and accurate recognition depends above all on pedestrian motion feature extraction.

[0003] Pedestrian posture recognition includes global feature methods and local feature methods. Global feature methods mostly adopt motion history images, that is, the frame...
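
The motion-history-image technique referred to above (the sentence is cut off in the source) encodes how recently each pixel moved, so older motion fades while recent motion stays bright. A rough sketch of that decaying accumulation, with an assumed duration and threshold, is:

import cv2
import numpy as np

# Rough sketch of the motion-history-image idea mentioned in the background:
# pixels that just moved are set to a maximum value (duration) and decay by
# one per frame otherwise. Duration and threshold are illustrative assumptions;
# mhi starts as an integer array of zeros with the frame shape.
def update_motion_history(mhi, prev_gray, curr_gray, duration=15, thresh=30):
    moving = cv2.absdiff(curr_gray, prev_gray) > thresh
    return np.where(moving, duration, np.maximum(mhi - 1, 0))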


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V40/20; G06V10/50; G06V10/774; G06V10/82; G06N3/04; G06N3/08; G06K9/62
CPC: G06V40/20; G06V10/50; G06F18/214
Inventor: 刘辉, 李燕飞, 韩宇阳
Owner: CENT SOUTH UNIV