
Depth image matching method based on ASIFT (Affine Scale-invariant Feature Transform)

A depth image matching technology, applied in image analysis, image data processing, instruments, and the like, which can solve problems such as narrow adaptability, difficult feature recognition, and poor matching accuracy.

Publication Date: 2014-11-19 (status: Inactive)
Applicant: SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

However, coarse matching methods based on geometric features have difficulty recognizing the features of regular or symmetrical objects, and their adaptability is narrow.
Matching methods based on texture features, such as feature extraction based on Harris corner points and point feature extraction based on the Scale Invariant Feature Transform (SIFT), are not fully affine-invariant. When a sufficiently large affine transformation exists between the two images, these methods cannot detect enough common features in them, so matching accuracy is poor under large changes in viewpoint and a satisfactory result cannot be obtained.




Embodiment Construction

[0034] Referring to Figure 1, an ASIFT-based depth image matching method includes:

[0035] S1. Acquire the depth image and grayscale image of the measured object in two fields of view, and obtain the correspondence between the depth image and the grayscale image within each field of view; the two fields of view have different viewing angles but share an overlapping region (see the back-projection sketch after these steps);

[0036] S2. Use the ASIFT algorithm to extract the feature point pair set of the grayscale images in the two fields of view (see the ASIFT matching sketch after these steps);

[0037] S3. According to the correspondence between the depth image and the grayscale image in each of the two fields of view, obtain the depth image point pair set corresponding to the grayscale image feature point pair set, and then screen this set according to the principle that spatial features are invariant under rigid-body transformation (see the screening sketch after these steps), removing invalid depth image point pairs so as to obtain an effective depth image poin...
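The correspondence in step S1 can be read as a depth map registered pixel-for-pixel to the grayscale image. Below is a minimal back-projection sketch under that assumption; the intrinsic parameters fx, fy, cx, cy and the function name are illustrative, not taken from the patent.

```python
import numpy as np

def backproject(u, v, depth_map, fx, fy, cx, cy):
    """Map a grayscale pixel (u, v) to a 3D point via the aligned depth map.

    Assumes the depth image is registered pixel-for-pixel to the grayscale
    image and that depth_map[v, u] stores metric depth along the optical axis.
    """
    z = depth_map[v, u]
    if z <= 0:                      # missing or invalid depth sample
        return None
    x = (u - cx) * z / fx           # pinhole back-projection
    y = (v - cy) * z / fy
    return np.array([x, y, z])
```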
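Step S2 leaves the ASIFT implementation open. As one possible sketch, assuming OpenCV >= 4.5 (whose cv2.AffineFeature wrapper simulates affine tilts around a SIFT backend, which is the ASIFT idea), feature point pairs could be extracted as follows; gray1, gray2 and the ratio threshold are placeholders:

```python
import cv2

def asift_match(gray1, gray2, ratio=0.75):
    """Extract candidate feature point pairs between two grayscale images
    with an ASIFT-style detector (OpenCV's AffineFeature over SIFT)."""
    asift = cv2.AffineFeature_create(cv2.SIFT_create())
    kp1, des1 = asift.detectAndCompute(gray1, None)
    kp2, des2 = asift.detectAndCompute(gray2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = []
    for m, n in matcher.knnMatch(des1, des2, k=2):
        if m.distance < ratio * n.distance:      # Lowe's ratio test
            pairs.append((kp1[m.queryIdx].pt, kp2[m.trainIdx].pt))
    return pairs
```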
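One plausible reading of the rigid-body "spatial feature invariance" screen in step S3 is that pairwise Euclidean distances are preserved by any rigid transform, so a candidate 3D point pair is kept only if its distances to the other candidates agree between the two views. The tolerance and the majority-vote threshold below are assumptions, not values from the patent:

```python
import numpy as np

def filter_rigid_consistent(pts_a, pts_b, tol=2.0):
    """Screen corresponding 3D point pairs using distance invariance.

    pts_a, pts_b: (N, 3) arrays of candidate corresponding points from the
    two views. A pair is kept only if its distances to at least half of the
    other candidates agree between the views to within tol (depth units).
    """
    pts_a, pts_b = np.asarray(pts_a), np.asarray(pts_b)
    d_a = np.linalg.norm(pts_a[:, None] - pts_a[None, :], axis=-1)
    d_b = np.linalg.norm(pts_b[:, None] - pts_b[None, :], axis=-1)
    consistent = np.abs(d_a - d_b) < tol
    keep = consistent.sum(axis=1) >= 0.5 * len(pts_a)   # majority vote
    return pts_a[keep], pts_b[keep]
```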



Abstract

The invention discloses a depth image matching method based on ASIFT (Affine Scale-Invariant Feature Transform). The method comprises the following steps: acquiring depth images and grayscale images of the measured object in two fields of view, and obtaining the correspondence between the depth image and the grayscale image within each field of view; extracting the feature point pair sets of the grayscale images in the two fields of view using the ASIFT algorithm; acquiring the depth image point pair sets corresponding to the grayscale feature point pair sets from those correspondences, and then screening the depth image point pair sets according to the principle that spatial features are invariant under rigid-body transformation, so as to obtain effective depth image point pair sets; computing an initial rotation matrix and translation matrix from the effective depth image point pair sets by the least-squares method; and iterating with the initial rotation matrix and translation matrix as the start values of the ICP algorithm, so as to achieve precise matching of the depth images in the two fields of view. The depth image matching method has wide adaptability and high matching precision, and can be widely applied in the fields of three-dimensional digital imaging and optical three-dimensional reconstruction.
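The abstract states that the initial rotation and translation matrices are computed by least squares from the effective depth image point pairs and then used as the ICP start values. A standard SVD-based (Kabsch) solution of that least-squares problem is sketched below; this is one common route, not necessarily the patent's exact procedure:

```python
import numpy as np

def initial_rigid_transform(pts_a, pts_b):
    """Least-squares estimate of R, t with R @ a + t ~ b over matched pairs.

    SVD (Kabsch) solution over the screened depth image point pairs; the
    result is intended only as the ICP iteration start value.
    """
    pts_a, pts_b = np.asarray(pts_a), np.asarray(pts_b)
    ca, cb = pts_a.mean(axis=0), pts_b.mean(axis=0)   # centroids
    H = (pts_a - ca).T @ (pts_b - cb)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                          # guard against reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```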

Description

Technical Field

[0001] The invention relates to the fields of three-dimensional digital imaging and optical three-dimensional reconstruction, and in particular to an ASIFT-based depth image matching method.

Background Technique

[0002] 3D Digital Imaging and Modeling (3DIM) is an emerging interdisciplinary field that has been actively researched internationally in recent years. It is widely used in reverse engineering, cultural relic protection, medical diagnosis, industrial inspection, virtual reality, and many other areas. The spatial matching of depth images is a critical link in 3DIM technology. Owing to the limited field of view of the scanning device and occlusion by the object itself, it is impossible to obtain all the information describing the object's shape in a single scan. Therefore, in order to obtain a complete data model of the measured object, it is necessary to scan the object repeatedly from multiple perspe...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T17/00G06T7/00
Inventor 李东田劲东刘春阳
Owner SHENZHEN UNIV