
Matching of Image Feature Points, Parallax Extraction and Depth Information Extraction Methods

A technology for matching image feature points, applied in the field of image processing, which solves problems such as the large amount of logic operations in prior approaches and their inability to meet real-time requirements at D1 resolution.

Active Publication Date: 2020-10-30
ALLWINNER TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] In view of this, embodiments of the present invention provide methods for matching image feature points, extracting parallax, and extracting depth information, to solve the technical defect that the SAD algorithm used for image matching in the prior art requires a large amount of logic operations and cannot satisfy real-time requirements at D1 resolution.



Examples


Embodiment 1

[0050] Embodiment 1 of the present invention can be applied to products that require scene modeling, such as object positioning in intelligent security, three-dimensional positioning of drones, obstacle detection in automatic driving, robot navigation, and 3D printing. Figure 1 is a flowchart of the method for matching image feature points provided by Embodiment 1 of the present invention. The method of this embodiment specifically includes:

[0051] 110. Image acquisition, where the acquired images are a left view and a right view.

[0052] In this embodiment, the left view and the right view refer to two images of the same target object obtained from two different positions. Specifically, they may be obtained through two identical or different cameras at different positions, or the left view and the right view may be obtained by moving the same camera, which is not limited in this embodiment.

[0053] Wherein, the target object specific...
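The acquisition step above only requires two views of the same scene from laterally displaced positions. As a toy illustration (not the patent's capture pipeline, which would use real cameras), the sketch below synthesizes a left/right pair by shifting a scene horizontally, mimicking the geometry of two displaced viewpoints; the function name and shift model are hypothetical.

```python
import numpy as np

def synthesize_stereo_pair(scene, disparity):
    """Toy stand-in for step 110: produce a left/right view pair of the
    same scene as seen from two horizontally displaced positions.  A real
    system captures these with two cameras (or one moved camera); here the
    right view is the scene shifted left by `disparity` pixels, so a point
    at column c in the left view appears at column c - disparity on the
    right, as in a rectified stereo rig."""
    left = scene.copy()
    right = np.zeros_like(scene)
    right[:, :-disparity] = scene[:, disparity:]
    return left, right
```

Pixels shifted in from beyond the right edge are simply zero here; a real second camera would of course see actual scene content there.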

Embodiment 2

[0114] Figure 2 is a flowchart of the method for matching image feature points provided in Embodiment 2 of the present invention. This embodiment is optimized on the basis of the above embodiments. In this embodiment, the left and right views in step 1 are refined as follows: the views come from a binocular vision system, in which the left view is captured by the left camera and the right view is captured by the right camera.

[0115] Further, the optimization further includes: obtaining the internal and external parameters of the binocular vision system in advance, and establishing the camera coordinate system.

[0116] Further, after step 1, the optimization also includes: preprocessing the left view and the right view.

[0117] Further, the preprocessing of the left view and the right view is optimized as: performing SOBEL filtering processing on the left view and the right view.

[0118] Further, the left view search blo...
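The Sobel preprocessing mentioned in paragraph [0117] can be sketched as a plain 3x3 convolution; the kernel below is the standard horizontal-gradient Sobel operator (the patent does not specify which Sobel variant is used, so this is an illustrative assumption).

```python
import numpy as np

# Standard 3x3 Sobel kernel for horizontal gradients (responds to
# vertical edges), a common choice for stereo preprocessing.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.int32)

def sobel_filter(img, kernel=SOBEL_X):
    """Apply a 3x3 Sobel kernel to a grayscale image (valid region only),
    returning the absolute gradient response.  Output is (h-2, w-2)."""
    img = img.astype(np.int32)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.int32)
    for r in range(h - 2):
        for c in range(w - 2):
            out[r, c] = abs((img[r:r + 3, c:c + 3] * kernel).sum())
    return out
```

Filtering both views before matching emphasizes edges and suppresses flat regions, which makes SAD-style block comparison less sensitive to brightness differences between the two cameras.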

Embodiment 3

[0165] Figure 3 is a flowchart of the disparity extraction method provided by Embodiment 3 of the present invention. The method of this embodiment specifically includes:

[0166] 310. Obtain the internal and external parameters of the camera, and establish the camera coordinate system;

[0167] 320. Obtain matched feature points according to the method in Embodiment 1 or Embodiment 2;

[0168] 330. In the established camera coordinate system, obtain the disparity of the matched feature points in the camera coordinate system.
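Once disparity is extracted as in step 330, depth follows from the standard rectified-stereo triangulation relation Z = f·B/d, using the camera parameters obtained in step 310. The sketch below is a minimal illustration of that relation; the focal length and baseline values in the usage are hypothetical, not from the patent.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Recover depth Z from disparity d via the rectified stereo relation
    Z = f * B / d, where f is the focal length in pixels (an internal
    parameter) and B is the baseline between camera centres in metres
    (from the external parameters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed focal length of 1000 px and a 0.1 m baseline, a 50 px disparity corresponds to a depth of 2 m; note that depth resolution degrades as disparity shrinks, since Z is inversely proportional to d.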



Abstract

An embodiment of the present invention discloses an image feature point matching, parallax extraction and depth information extraction method. The method comprises: acquiring an image; setting search blocks; calculating an HSAD value of a first location; calculating M SAD values corresponding to the first location, and taking the center of the right-view search block corresponding to the minimum of the M SAD values as the matching feature point of the first location; and calculating an HSAD value of the Nth location; and calculating M SAD values corresponding to the Nth location, and taking the center of the right-view search block corresponding to the minimum of the M SAD values as the matching feature point of the Nth location. According to the technical scheme of the embodiment of the present invention, the calculation amount of the SAD algorithm is greatly reduced and the calculation speed is improved, so that the SAD algorithm can meet the real-time matching requirements of images at D1 resolution and above.
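The core SAD search described in the abstract can be sketched as follows. This is plain minimum-SAD block matching along a scanline, not the patent's HSAD optimization (whose details are not given in this summary); the function names and the search range M = `max_disp` are illustrative assumptions.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

def match_point(left, right, row, col, block=5, max_disp=16):
    """For a feature point (row, col) in the left view, evaluate M = max_disp
    candidate search blocks along the same scanline in the right view and
    return the disparity whose block gives the minimum SAD value, i.e. the
    offset of the matching feature point."""
    h = block // 2
    ref = left[row - h:row + h + 1, col - h:col + h + 1]
    best_d, best_sad = 0, None
    for d in range(max_disp):
        c = col - d  # candidate block centre in the right view
        if c - h < 0:
            break  # candidate block would fall off the image
        cand = right[row - h:row + h + 1, c - h:c + h + 1]
        s = sad(ref, cand)
        if best_sad is None or s < best_sad:
            best_d, best_sad = d, s
    return best_d
```

Exhaustively evaluating every block pair like this is exactly the cost the patent targets: the claimed scheme prunes the work (via the HSAD value per location) so the minimum-SAD match can be found with far fewer operations.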

Description

Technical Field

[0001] The embodiments of the present invention relate to the technical field of image processing, and in particular to methods for matching image feature points, extracting parallax, and extracting depth information.

Background Technique

[0002] With the continuous development of computer technology, stereo vision technology has attracted wide attention and application and has become a hot research topic. Following the principle by which biological vision perceives depth information, a stereo vision system uses cameras to obtain a calibration image and a target image, performs camera calibration on the calibration image to obtain the internal and external parameters of the cameras, and finally recovers the depth information of spatial points from the position information of the cameras and the disparity values of the matched feature points in the target image.

[0003] For the matching of feature points in the above proce...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/50, G06T7/73, G06T7/80
CPC: G06T7/50, G06T7/73, G06T7/80
Inventor: 刘劲松
Owner: ALLWINNER TECH CO LTD