Pedestrian real-time detection method in complex environment based on deep learning

A deep-learning method in the field of target recognition that addresses problems such as target-background mixing, dim light, and false detection in complex environments, with the effect of ensuring detection accuracy, improving detection capability, and increasing detection speed.

Active Publication Date: 2020-06-09
SHANGHAI MARITIME UNIVERSITY
Cites: 10 · Cited by: 4

AI Technical Summary

Problems solved by technology

However, in some complex natural environments, detection that relies only on visible-spectrum images or infrared-spectrum images is not accurate enough.
Pedestrian targets encounter many challenges in complex environments, such as smoke, rain, dust, and dim light.
RGB color images are rich in spectral information and can reflect scene details under adequate lighting conditions, but targets are difficult to detect when visibility is poor. Infrared thermal images are thermal-radiation images whose gray levels are determined by the temperature difference between the observed target and the background; they usually lack structural information, so the target easily blends into the background, resulting in false detections and missed detections.




Embodiment Construction

[0027] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0028] The invention provides a deep learning-based real-time detection method for pedestrians in complex environments, which is used to detect pedestrian targets in pedestrian images captured in complex environments. In this embodiment, the hardware configuration is a server with an Intel i7 8700K processor, an NVIDIA TITAN XP graphics card, and 64 GB of RAM; the software environment is the Ubuntu 16.04 system with the Darknet framework.
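For orientation, a Darknet training run of the kind described in this embodiment is typically launched from the command line as sketched below. The `.data`, `.cfg`, and pretrained-weight file names are placeholders for illustration only; the patent does not disclose the actual configuration files.

```shell
# Hypothetical Darknet invocation; file names are assumed, not from the patent.
# <data file>: dataset paths and class count; <cfg file>: network definition;
# darknet53.conv.74: commonly used pretrained convolutional weights.
./darknet detector train cfg/pedestrian.data cfg/yolo-pedestrian.cfg darknet53.conv.74 -gpus 0
```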

[0029] Such as Fi...



Abstract

The invention provides a real-time pedestrian detection method in a complex environment based on deep learning. The method comprises the following steps: S1, establishing a detection model based on the YOLO algorithm; S2, selecting a plurality of pedestrian images in complex environments from a color-thermal image library, establishing a training data set and a test data set, inputting the training data set into the detection model, and training the detection model; S3, inputting the test data set into the trained detection model, outputting detection results of pedestrian targets in the infrared thermal images and RGB color images of the test data set, and screening the detection results by a non-maximum suppression method; and S4, comparing with the YOLOv3 and YOLO-tiny detection algorithms to verify the detection precision and detection speed.
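The screening step S3 uses standard non-maximum suppression. A minimal sketch of that procedure is given below; the box representation `(x1, y1, x2, y2)` and the IoU threshold of 0.45 are common conventions assumed here, not values disclosed by the patent.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.45):
    """Greedy NMS: keep the highest-scoring box, drop boxes that
    overlap it beyond the threshold, and repeat on the remainder."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep
```

Duplicate detections of the same pedestrian from the two spectral branches would be merged by this step, since heavily overlapping boxes collapse to the single highest-confidence one.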

Description

technical field

[0001] The invention belongs to the field of target recognition, and in particular relates to a deep learning-based method for real-time detection of pedestrians in complex environments.

Background technique

[0002] Pedestrian detection has attracted extensive attention from the computer vision community due to its wide applications in areas such as driver assistance (autonomous driving vehicles), robotics, person re-identification, video surveillance, and pedestrian behavior analysis. At present, compared with traditional methods, deep learning technology has achieved good results in the field of pedestrian detection. However, in some complex natural environments, detection that relies only on visible-spectrum images or infrared-spectrum images is not accurate enough. Pedestrian targets encounter many challenges in complex environments, such as smog, rain, dust, and dim light. RGB color images are rich in spectral information and can reflec...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06V 40/10; G06V 2201/07; G06N 3/045; Y02T 10/40
Inventors: 孙丽华, 周薇娜
Owner: SHANGHAI MARITIME UNIVERSITY