
Local image area characteristic extraction method based on scale prediction

A technology relating to local-region features and extraction methods, applied in computer components, instruments, and character and pattern recognition. It addresses the problem that the SURF feature extraction method lacks affine invariance.

Active Publication Date: 2016-12-21
HARBIN INST OF TECH
Cites: 2 · Cited by: 8


Problems solved by technology

[0005] The present invention solves the problem that the existing SURF feature extraction method lacks affine invariance.



Examples


Specific Embodiment 1

[0036] Embodiment 1: This embodiment is described with reference to Figure 1.

[0037] A method for feature extraction of image local regions based on scale prediction, comprising the following steps:

[0038] Step 1: Based on the probe's position relative to the predetermined landing site during planetary landing, the attitude of the probe body coordinate system relative to the planetary-surface images taken in orbit, the camera focal length, the field of view, and other information, preliminarily estimate the position of the currently captured image on the surface of the target celestial body, and select the search range within the global feature library;

[0039] Step 2: According to the pose information of the detector and the feature scale of the corresponding feature point in the global feature library, predict the feature scale of the feature in the captured image;

[0040] Step 3: Predict the rotation angle of the feature in the c...

Specific Embodiment 2

[0044] The specific implementation process of step 2 described in this embodiment is as follows:

[0045] Let d₁ be the distance between the detector and the feature when the image used to build the global feature library was captured, f₁ the focal length of the camera used at that time, α₁ the angle between the line from the camera's optical center to the feature point and the optical axis, and σ₁ the feature scale of a feature in the global feature library. This feature then undergoes a scaling transformation in the descent image, whose scale transformation is

[0046] σ₂ = d₁ f₂ cos α₁ / d ...
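The relation above is truncated in the source, but its visible numerator is consistent with the standard pinhole-camera scale relation (image scale proportional to f · cos α / d). A minimal sketch, assuming the hypothetical completion σ₂ = σ₁ · (d₁ f₂ cos α₁) / (d₂ f₁ cos α₂), where d₂, f₂, α₂ are the corresponding descent-image quantities and the function name is an illustration only:

```python
import math

def predicted_scale(sigma1, d1, f1, alpha1, d2, f2, alpha2):
    """Hypothetical completion of the truncated scale relation, assuming
    a pinhole camera (image scale proportional to f * cos(alpha) / d):
        sigma2 = sigma1 * (d1 * f2 * cos(alpha1)) / (d2 * f1 * cos(alpha2))
    Subscript 1: capture for the global feature library;
    subscript 2: descent image. Angles are in radians."""
    return sigma1 * (d1 * f2 * math.cos(alpha1)) / (d2 * f1 * math.cos(alpha2))
```

Under this assumed form, halving the distance to the feature at equal focal length and viewing angle doubles the predicted feature scale.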

Specific Embodiment 3

[0051] The specific implementation process of step 3 described in this embodiment is as follows:

[0052] A feature in the global feature library undergoes a rotation transformation in the descent-segment image, with rotation angle

[0053]

[0054] Other steps and parameters are the same as those in Embodiment 1 or Embodiment 2.



Abstract

The invention relates to the field of planetary landing image processing, and in particular to a local image region feature extraction method based on scale prediction; it solves the problem that the SURF feature extraction method lacks affine invariance. In the method, the position of the currently captured image on the surface of the target celestial body is preliminarily predicted, and a search range within the global feature library is selected. Based on the pose information of the detector at the time of shooting and the feature scale and orientation information of the corresponding feature point in the global feature library, the feature scale and rotation angle of the feature in the captured image are predicted. The captured image is rotated by the predicted rotation angle to obtain a rotated image; a feature detection template is generated from the predicted feature scale using a quadratic-form maximization method; the template is convolved with the rotated image; and non-maximum suppression is performed in position space, thereby achieving feature extraction. The method is applicable to processing autonomous navigation images during planetary landing.
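The pipeline described above (rotate by the predicted angle, build a detection template at the predicted scale, convolve, then suppress non-maxima in position space) can be sketched roughly as follows. This is a minimal illustration, not the patent's method: a negated Laplacian-of-Gaussian kernel stands in for the quadratic-form template, and all function names are hypothetical.

```python
import numpy as np
from scipy import ndimage

def log_template(sigma):
    """Negated Laplacian-of-Gaussian kernel at the predicted scale.
    (Stand-in for the patent's quadratic-form detection template.)"""
    size = int(6 * sigma) | 1          # odd width covering roughly +/- 3 sigma
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    g = np.exp(-r2 / (2.0 * sigma ** 2))
    # Negated so that bright blobs produce positive response peaks.
    return -(r2 / sigma ** 2 - 2.0) * g / (2.0 * np.pi * sigma ** 4)

def extract_features(image, angle_deg, sigma, threshold):
    """Rotate by the predicted angle, convolve with the scale-matched
    template, and keep local response maxima (non-maximum suppression
    in position space). Returns (row, col) peak coordinates."""
    rotated = ndimage.rotate(image, angle_deg, reshape=False, order=1)
    response = ndimage.convolve(rotated, log_template(sigma))
    local_max = ndimage.maximum_filter(response, size=5)
    peaks = (response == local_max) & (response > threshold)
    return np.argwhere(peaks)
```

On a synthetic image containing a bright Gaussian blob, running `extract_features` with a matching scale yields a response peak near the blob's center, illustrating how a correctly predicted scale lets a single-scale template recover the feature.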

Description

Technical field [0001] The invention relates to the technical field of image processing in autonomous navigation for planetary landing, and in particular to a feature extraction method based on scale prediction. Background technique [0002] In the design of autonomous navigation missions for planetary landing, a two-dimensional image database fixed to the planetary coordinate system must be established from early on-orbit observation images and high-definition images collected by planetary rovers. Visual information is extracted from the images taken by the spaceborne camera and matched against this database to obtain the lander's own absolute pose. Efficient and accurate recognition of visual information in images is therefore an important prerequisite for planetary autonomous navigation missions. Since there are differences in illumination, shooting angle, and scale between the established database and the images taken during planetary landing...

Claims


Application Information

Patent Timeline
No application data available.
Patent Type & Authority: Application (China)
IPC(8): G06K9/46, G06K9/62
CPC: G06V10/44, G06F18/253
Inventor: 田阳, 崔祜涛, 余萌, 徐田来
Owner: HARBIN INST OF TECH