Pedestrian re-identification method and device based on global features and coarse granularity local features

A technology combining local features and global features, applied in the fields of image processing and identity recognition, achieving low computational complexity, pedestrian identity recognition, and improved efficiency

Inactive Publication Date: 2018-03-06
PEKING UNIV

AI Technical Summary

Benefits of technology

The technology divides the pedestrian's body in the image at a coarse granularity, so that each part region groups pixels that belong together rather than isolated fine fragments. It then describes the pedestrian both globally and per part region, using appearance cues such as grayscale and colour patterns, instead of treating the pedestrian and the background as one. By combining the global features with the coarse-grained local features, the system can match pedestrian images accurately without requiring much computing power, and pedestrian identity recognition based on the pedestrian's own appearance becomes more efficient.

Problems solved by technology

The technical problem addressed by this patent is identifying pedestrians in images captured by cameras, i.e., matching the same person across camera views. Existing methods analyse large numbers of pixel values with complicated algorithms and therefore require significant computing power.


Examples


Embodiment 1

[0052] According to an embodiment of the present invention, a pedestrian re-identification method based on global features and coarse-grained local features is provided, as shown in Figure 1, and includes:

[0053] Step 101: Detect the pedestrian in the query image and take the detected pedestrian image as the global image; detect the pedestrian's human-body key points and divide the pedestrian's body according to those key points to obtain the local part regions;
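
As a concrete illustration of Step 101, below is a minimal sketch of dividing a detected pedestrian crop into coarse-grained part regions from body key points. The key-point names and the head / torso / legs split are illustrative assumptions; the text above does not fix a specific partition or pose estimator.

```python
# Minimal sketch of Step 101, assuming body key points are already available
# from some pose estimator. The key-point names and the head/torso/legs split
# are assumptions made for illustration only.
import numpy as np

def partition_by_keypoints(global_img: np.ndarray, keypoints: dict) -> dict:
    """global_img: HxWx3 pedestrian crop; keypoints: name -> (x, y) in crop coordinates."""
    h = global_img.shape[0]
    neck_y = int(keypoints["neck"][1])                                      # assumed key-point name
    hip_y = int((keypoints["left_hip"][1] + keypoints["right_hip"][1]) / 2)  # mid-hip height
    return {
        "head":  global_img[:neck_y],        # rows above the neck
        "torso": global_img[neck_y:hip_y],   # neck down to the hips
        "legs":  global_img[hip_y:h],        # hips down to the feet
    }
```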

[0054] Step 102: Extract the global feature description of the global image and the local feature descriptions of the local part regions, and fuse the extracted global and local feature descriptions to obtain a global-local feature description;
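
The following sketch illustrates Step 102 under stated assumptions: colour histograms stand in for whatever feature extractor the method actually uses, and fusion by weighted concatenation is assumed; only the global-plus-per-part structure follows the text above.

```python
# Minimal sketch of Step 102: one global descriptor, one descriptor per part
# region, fused into a single global-local description. The histogram feature
# and the weighted-concatenation fusion are assumptions for illustration.
import numpy as np

def color_histogram(region: np.ndarray, bins: int = 16) -> np.ndarray:
    hist, _ = np.histogramdd(region.reshape(-1, 3).astype(np.float32),
                             bins=(bins,) * 3, range=((0, 256),) * 3)
    hist = hist.ravel()
    return hist / (hist.sum() + 1e-6)   # L1-normalise so region size does not dominate

def fuse_descriptions(global_img: np.ndarray, parts: dict, w_global: float = 0.5) -> np.ndarray:
    global_desc = color_histogram(global_img)                                   # global feature description
    local_desc = np.concatenate([color_histogram(p) for p in parts.values()])   # local feature descriptions
    # Global-local feature description: weighted concatenation (assumed fusion rule).
    return np.concatenate([w_global * global_desc, (1.0 - w_global) * local_desc])
```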

[0055] Step 103: After performing association analysis and combined indexing on each image in the pedestrian database, perform pedestrian retrieval from coarse-grained to fine-grained according to the ...
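
Step 103 is only partially visible above; as a hedged reading of "coarse-grained to fine-grained", the sketch below filters the pedestrian database first on the global part of the descriptor and then re-ranks the survivors on the full global-local descriptor. The association analysis and combined indexing themselves are not shown, and the two-stage split is an assumption.

```python
# Minimal sketch of coarse-to-fine retrieval, assuming the pedestrian database
# has already been indexed as (pedestrian_id, descriptor) pairs. The coarse
# stage uses only the global part of the descriptor; the fine stage re-ranks
# with the full global-local descriptor. Both choices are assumptions.
import numpy as np

def retrieve(query_desc: np.ndarray, gallery: list, global_dim: int,
             coarse_k: int = 100, top_k: int = 10) -> list:
    q_global = query_desc[:global_dim]
    # Coarse stage: keep the gallery entries closest in global-feature space.
    coarse = sorted(gallery, key=lambda e: np.linalg.norm(e[1][:global_dim] - q_global))[:coarse_k]
    # Fine stage: re-rank the survivors with the full global-local descriptor.
    fine = sorted(coarse, key=lambda e: np.linalg.norm(e[1] - query_desc))[:top_k]
    return [pid for pid, _ in fine]
```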

Embodiment 2

[0084] According to an embodiment of the present invention, a pedestrian re-identification device based on global features and coarse-grained local features is provided, as shown in Figure 2, and includes:

[0085] The first detection module 201 is configured to detect the pedestrian image in the query image and to use the detected pedestrian image as the global image;

[0086] The second detection module 202 is configured to detect the pedestrian's human-body key points in the query image and to divide the pedestrian's body according to those key points to obtain the local part regions;

[0087] The extraction module 203 is configured to extract the global feature description of the global image obtained by the first detection module 201 and the local feature descriptions of the local part regions obtained by the second detection module 202;

[0088] The fusion module 204 is used to fuse the global feature description and the local feature description extracted b...
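
To make the data flow between modules 201-204 concrete, here is a minimal wiring sketch. The four callables passed to the constructor are hypothetical placeholders for whatever detectors and feature extractors the device actually uses; only the order in which the modules hand data to one another follows the description above.

```python
# Minimal sketch of how the Embodiment 2 modules could be composed. The
# constructor arguments are hypothetical placeholders supplied by the caller.
class PedestrianReIdDevice:
    def __init__(self, detect_pedestrian, detect_keypoints, partition, extract_and_fuse):
        self.detect_pedestrian = detect_pedestrian   # first detection module 201
        self.detect_keypoints = detect_keypoints     # second detection module 202
        self.partition = partition                   # key-point based body division (module 202)
        self.extract_and_fuse = extract_and_fuse     # extraction module 203 + fusion module 204

    def describe(self, query_image):
        global_img = self.detect_pedestrian(query_image)   # global image
        keypoints = self.detect_keypoints(query_image)      # body key points
        parts = self.partition(global_img, keypoints)       # local part regions
        return self.extract_and_fuse(global_img, parts)     # global-local feature description
```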



Abstract

The invention, which belongs to the field of image processing and identification, discloses a pedestrian re-identification method and device based on global features and coarse-granularity local features. The method comprises: detecting a pedestrian image in a query image as the global image, detecting the pedestrian's body key points, and dividing the pedestrian's body to obtain local part regions; extracting a global feature description of the global image and local feature descriptions of the local part regions, and fusing the global and local feature descriptions to obtain a global-local feature description; and carrying out association analysis and combined indexing on all images in a pedestrian database, retrieving pedestrians over the processed images from coarse granularity to fine granularity according to the global-local feature description, and determining the identity of the pedestrian in the query image. Because the body in the image is divided at a coarse granularity, good robustness is achieved; and because global features and regional local features are combined, accurate matching of pedestrian images and pedestrian identity recognition are realized.


Application Information

Owner PEKING UNIV