
Pedestrian local feature big data hybrid extraction method

A hybrid local-feature extraction technology, applied in the field of traffic monitoring, that addresses the problems of limited detection locations, low accuracy, and the difficulty of pedestrian detection

Active Publication Date: 2018-11-16
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

[0003] The difficulty of pedestrian detection lies in how to effectively identify pedestrian movement. Human motion is non-rigid, and different individuals differ noticeably in shape, speed, appearance, and clothing; their motion is irregular and its direction varies. These are the main reasons pedestrian detection is difficult.
Traditional pedestrian detection methods mainly include the microwave method, the beam method, and the pressure-sensor method. These methods are generally limited in where detection can take place, have low accuracy, and cannot exploit the specific, accurate pedestrian information that images provide.


Examples


Embodiment Construction

[0064] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0065] As shown in figure 1, a pedestrian local feature big data hybrid extraction method includes the following steps:

[0066] Step 1: Build a pedestrian motion database;

[0067] Collect videos of pedestrians in various motion postures and road positions from each shooting direction of the depth camera; there are three shooting directions, and the postures include walking, running, and standing;

[0068] Step 2: Extract images from the videos in the pedestrian motion database and preprocess the extracted images to obtain the pedestrian detection frame of each image frame, then extract the pedestrian detection frame images of the same pedestrian across consecutive image frames;
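The patent text does not say which detector produces the per-frame pedestrian detection boxes in step 2. As an illustration only, the sketch below uses OpenCV's stock HOG + linear-SVM people detector to crop a detection box from every sampled frame; the video file name, the frame_step sampling interval, and the detector parameters are assumptions, not values from the patent.

```python
# Sketch only: per-frame pedestrian detection crops, assuming OpenCV's
# built-in HOG people detector stands in for the (unspecified) detector
# used in step 2 of the patent.
import cv2

def extract_detection_frames(video_path, frame_step=5):
    """Grab every frame_step-th frame and return (frame_index, crop) pairs."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(video_path)
    crops = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % frame_step == 0:
            # detectMultiScale returns bounding boxes as (x, y, w, h)
            boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
            for (x, y, w, h) in boxes:
                crops.append((index, frame[y:y + h, x:x + w]))
        index += 1
    cap.release()
    return crops

# Hypothetical usage; "crossing_north.avi" is a placeholder file name.
# detections = extract_detection_frames("crossing_north.avi")
```

Linking the crops of one pedestrian across consecutive frames (the second half of step 2) would additionally require a simple tracker, for example nearest-box association, which this excerpt does not specify.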

[0069] Step 3: Perform grayscale processing on each pedestrian detection frame image, synthesize the motion energy map of the grayscale images corresponding to the pedestrian detection frame images...
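The step above is truncated before the synthesis rule for the motion energy map is given. The sketch below assumes a common construction, the Bobick-Davis motion energy image: the union of thresholded differences between consecutive grayscale crops, followed by a standard 64x128 HOG descriptor computed with OpenCV. The threshold, the resize target, and the assumption that the crops were already resized to a common shape are illustrative choices, not details from the patent.

```python
# Sketch under assumptions: motion energy map built as a Bobick-Davis style
# union of thresholded frame differences, then described with a standard
# 64x128 HOG descriptor. None of the numeric parameters come from the patent.
import cv2
import numpy as np

def motion_energy_map(gray_crops, diff_threshold=25):
    """Accumulate motion over a sequence of same-size grayscale crops."""
    # gray_crops are assumed to be uint8 images already resized to one shape.
    energy = np.zeros_like(gray_crops[0], dtype=np.uint8)
    for previous, current in zip(gray_crops, gray_crops[1:]):
        diff = cv2.absdiff(current, previous)
        _, moving = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
        energy = cv2.bitwise_or(energy, moving)
    return energy

def hog_of_energy_map(energy):
    """Resize to the default HOG window (64x128) and return the feature vector."""
    resized = cv2.resize(energy, (64, 128))
    hog = cv2.HOGDescriptor()              # 9 orientations, 8x8 cells, 16x16 blocks
    return hog.compute(resized).flatten()  # 3780-dimensional descriptor
```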



Abstract

The invention discloses a pedestrian local feature big data hybrid extraction method comprising: step 1, constructing a pedestrian motion database; step 2, extracting the pedestrian detection frame images of the same pedestrian in continuous image frames; step 3, extracting the HOG features of the same pedestrian's motion energy map; step 4, constructing a pedestrian motion pose recognition model based on a support vector machine; step 5, determining the pedestrian pose in the current video by using the support-vector-machine-based pose recognition model; step 6, calculating the instantaneous speed sequences of the pedestrian in the X-axis and Y-axis directions to obtain the real-time speed of the pedestrian; and step 7, according to a three-dimensional scene of the intersection environment, obtaining the position of the pedestrian in the image in real time and obtaining the pedestrian's real-time motion features in combination with the pedestrian pose and real-time speed. The method obtains more comprehensive and useful information, has a wide signal detection range, complete target information, and high cost performance, and is suitable for wide adoption.
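For steps 4 and 5 the abstract names a support-vector-machine pose recognition model over the HOG features of the motion energy maps, but gives no kernel, parameters, or training protocol. A minimal sketch is given below, assuming scikit-learn's SVC with an RBF kernel and the three pose labels from step 1; the train/test split and hyperparameters are placeholders.

```python
# Sketch only: an SVM pose-recognition model over motion-energy-map HOG
# features, assuming scikit-learn's SVC. Kernel choice and the train/test
# split are illustrative; the patent does not specify them.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

POSES = ["walking", "running", "standing"]   # pose labels from step 1

def train_pose_model(features, labels):
    """features: (n_samples, 3780) HOG vectors; labels: pose index per sample."""
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=0, stratify=labels)
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
    return model

def predict_pose(model, hog_vector):
    """Step 5: classify the pose of a new motion energy map's HOG vector."""
    return POSES[int(model.predict(hog_vector.reshape(1, -1))[0])]
```

The instantaneous X-axis and Y-axis speed sequences of step 6 and the position lookup in the three-dimensional intersection scene of step 7 are not sketched here, since the abstract does not give the tracking and calibration details they depend on.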

Description

technical field

[0001] The invention belongs to the field of traffic monitoring, and in particular relates to a big data hybrid extraction method for pedestrian local features.

Background technique

[0002] Nowadays, big data is applied in ever wider fields and has had a profound impact on human society. With the launch of the national "Made in China 2025" policy, big data has already had an obvious impact on transportation, industry, medical care, energy, climate, and other fields. In the field of transportation, ensuring the safety of people and property always comes first, and continuous efforts have been made to explore and improve. However, pedestrian traffic accidents are still common, and how to effectively protect pedestrians' lives and property remains a key and difficult issue in the field of transportation.

[0003] The difficulty of pedestrian detection lies in how to effectively identify pedestrian ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V40/20; G06V20/52; G06V10/507; G06F18/2411; G06F18/214
Inventor: 刘辉, 李燕飞
Owner: CENT SOUTH UNIV