
Image matching method and video processing method

An image matching method and related technology, applied in the field of image processing, which addresses problems such as poor matching robustness and accuracy, high computational complexity, and noise sensitivity, and achieves effects such as a reduced probability of falling into local optima, low algorithm cost, and low computational time consumption.

Active Publication Date: 2019-06-07
SPREADTRUM COMM (TIANJIN) INC

AI Technical Summary

Problems solved by technology

[0006] The pixel method uses the relationship between pixel gray values for motion estimation, but it is sensitive to noise and requires rich image information.
[0007] The block matching method performs motion estimation on the pixels in a block as a whole, so it is more robust than the pixel method, but the accuracy and computational complexity of the algorithm are strongly affected by the number, size, search range, and search strategy of the blocks.
[0008] The phase correlation method estimates the direction and speed of motion by calculating the cross-power spectrum of adjacent frames. It has strong noise resistance, but high computational complexity, and it is susceptible to interference from local motion.
[0009] The feature matching method is based on human visual characteristics: it estimates the global motion parameters of the camera by extracting and matching features between adjacent frames. Compared with other algorithms, it is closer to how the human visual system processes motion information. However, when there are other moving targets in the scene, different motion parameters appear within the scene, and feature point extraction may be confined to the area governed by a single motion parameter; the result is then limited by the feature extraction, which affects the robustness and accuracy of matching.
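As a hedged illustration of the feature-matching idea described in [0009] (not the patent's own algorithm), a global translation between adjacent frames can be estimated from matched feature points; taking a median over per-match displacements tolerates a minority of matches that fall on independently moving objects:

```python
# Sketch: estimate a global translation from matched feature points.
# The median over per-match displacements resists a minority of
# matches lying on independently moving objects (outliers).
from statistics import median

def estimate_translation(pts_a, pts_b):
    """pts_a, pts_b: lists of (x, y) points matched between two frames."""
    dx = median(bx - ax for (ax, _), (bx, _) in zip(pts_a, pts_b))
    dy = median(by - ay for (_, ay), (_, by) in zip(pts_a, pts_b))
    return dx, dy

# Static background shifted by (2, -1); the last match sits on a moving object.
a = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 5)]
b = [(2, -1), (12, -1), (2, 9), (12, 9), (30, 40)]
print(estimate_translation(a, b))  # -> (2, -1)
```

A robust estimator is used here purely to illustrate why feature matching degrades when too many features land on a single moving object, as the passage above notes.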



Examples


Embodiment 1

[0058] An image matching method, as shown in Figure 1, includes the following steps:

[0059]Step S100, perform region division on the image scene to obtain sub-regions.

[0060] The image scene refers to the image content that can be captured by an image capture device such as a camera within a field of view. In this embodiment, performing region division on the image scene refers to dividing the field of view of the image capture device.

[0061] Referring to Figure 2, field of view 1 of the image capture device is the field of view for horizontal shooting. The image content in field of view 1 includes a person, an animal, and a static background: the area corresponding to the person is shaded with oblique lines, the area corresponding to the animal is shaded with dotted lines, and the area corresponding to the static background has no shading.

[0062] In this step, there are several possible division methods:

[0063] The first division method:

[0064] The...
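The first division method is elided above. As an illustrative sketch only, assuming a uniform grid (one plausible division, not necessarily the one the patent describes), assigning a pixel coordinate to a sub-region could look like:

```python
# Sketch: divide an image scene into a uniform grid of sub-regions.
# The uniform grid is an assumption for illustration; the patent
# enumerates several division methods whose details are elided here.
def subregion_index(x, y, width, height, cols, rows):
    """Return the (col, row) grid cell containing pixel (x, y)."""
    col = min(int(x * cols / width), cols - 1)
    row = min(int(y * rows / height), rows - 1)
    return col, row

print(subregion_index(0, 0, 640, 480, 4, 3))      # -> (0, 0)
print(subregion_index(639, 479, 640, 480, 4, 3))  # -> (3, 2)
```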

Embodiment 2

[0134] An image matching method, as shown in Figure 9, includes the following steps:

[0135] Step S200, extracting and matching feature points in the first image and the second image to obtain matching feature points between the first image and the second image.

[0136] For a specific implementation manner of this step, reference may be made to step S101 in Embodiment 1.

[0137] Step S201, performing region division on the image scene distributed with the matching feature points to obtain sub-regions.

[0138] Different from the scene division in Embodiment 1, the object of division in this embodiment is the image scene in which the matching feature points are distributed.

[0139] Still referring to Figure 2, and in combination with Figure 7, the extent of the image scene in which the matching feature points are distributed is set based on the spatial positions of those matching feature points.

[0140] The image scene in which the m...
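The grouping used by this embodiment can be sketched as follows, under the illustrative assumption of a uniform grid over the region where the matched points lie (the patent's actual division rule may differ):

```python
# Sketch: group matched feature points by the sub-region they fall in.
# The uniform-grid geometry is assumed for illustration only.
from collections import defaultdict

def group_by_subregion(points, width, height, cols, rows):
    """Map each (x, y) point to its grid cell and collect per-cell groups."""
    groups = defaultdict(list)
    for (x, y) in points:
        col = min(int(x * cols / width), cols - 1)
        row = min(int(y * rows / height), rows - 1)
        groups[(col, row)].append((x, y))
    return dict(groups)

pts = [(10, 10), (20, 15), (600, 400)]
groups = group_by_subregion(pts, 640, 480, 2, 2)
print(sorted(groups))  # -> [(0, 0), (1, 1)]
```

Grouping by sub-region is what later allows the fitting subset to be drawn from as many groups as possible, per the Abstract.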

Embodiment 3

[0158] A video processing method, as shown in Figure 16, includes:

[0159] Step S300, fitting a matching model between adjacent frame images.

[0160] For the process of fitting the matching model between adjacent frame images in this step, reference may be made to the matching method described in any one of Embodiment 1 or Embodiment 2.

[0161] Step S301, performing motion compensation on frame images based on the matching model.

[0162] In this embodiment, motion compensation is used to implement video deshaking. Based on the matching model obtained by fitting in step S300, the video sequence (image sequence) may be filtered and fitted to obtain a stable video stream. The reference frame is then aligned to the current frame, the difference between the reference frame and the current frame is calculated, and the current frame is filled from the reference frame, thereby realizing motion compensation.
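The filter-then-compensate idea in [0162] can be sketched with an illustrative translation-only model; the model type, window size, and moving-average filter are assumptions for this sketch, not the patent's specification:

```python
# Sketch: stabilize a per-frame translation trajectory with a moving
# average, then derive the compensating shift for each frame.
# Translation-only model and window=3 are illustrative assumptions.
def smooth(traj, window=3):
    """Centered moving average with shrinking windows at the ends."""
    half = window // 2
    out = []
    for i in range(len(traj)):
        lo, hi = max(0, i - half), min(len(traj), i + half + 1)
        out.append(sum(traj[lo:hi]) / (hi - lo))
    return out

raw = [0, 5, -4, 6, -3]            # jittery camera path (x component only)
stable = smooth(raw)               # low-pass filtered path
correction = [s - r for s, r in zip(stable, raw)]
print([round(c, 2) for c in correction])
```

Applying each frame's correction (in a full system, by warping the frame) yields a sequence that follows the smoothed path instead of the jittery one.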

[0163] In addition, multiple reference frames c...



Abstract

The invention relates to an image matching method and a video processing method. The image matching method comprises: dividing an image scene into regions to obtain sub-regions; extracting feature points in a first image and a second image and matching them to obtain matched feature points between the first and second images; grouping the matched feature points according to the sub-regions; selecting a matching feature point subset from the matched feature points according to a matching model between the first image and the second image, the subset covering as many groups as possible; and fitting the matching model using the matching feature point subset to obtain an image matching result. The method can improve matching robustness in the jitter-removal (deshaking) of a video.
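As a non-authoritative sketch of the "subset covering as many groups as possible" selection step, one simple round-robin strategy takes one match per sub-region group before taking a second from any group; the patent's exact selection rule is not specified here:

```python
# Sketch: choose a fitting subset that spans as many sub-region groups
# as possible by taking one match per group first (round-robin).
# This illustrates the selection idea only, not the patent's exact rule.
def select_subset(groups, k):
    """groups: dict mapping sub-region id -> list of matches; pick k matches."""
    chosen = []
    pools = [list(v) for v in groups.values() if v]
    i = 0
    while len(chosen) < k and any(pools):
        pool = pools[i % len(pools)]
        if pool:
            chosen.append(pool.pop(0))
        i += 1
    return chosen

groups = {"A": [1, 2, 3], "B": [4], "C": [5, 6]}
print(select_subset(groups, 3))  # -> [1, 4, 5], one from each group
```

Drawing the fitting subset from many groups reduces the chance that all samples come from one moving object, which is the local-optimum risk the Abstract describes.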

Description

technical field

[0001] The invention relates to the field of image processing, in particular to an image matching method and a video processing method.

Background technique

[0002] In an actual photography system, the video obtained by a mobile platform such as a vehicle, a handheld device, or an aircraft includes not only the active motion of the imaging system but also the random motion of the mobile platform. The unstable video generated by the random motion makes viewers tired and makes it difficult to extract useful information. Therefore, how to convert unstable video into stable video is of great significance.

[0003] Video deshaking, also called video stabilization, is a very important video processing technology. It is designed to eliminate video jitter and ensure clear, stable video images, so that videos can be better compressed, thereby improving video quality and speed.

[0004] Video jitter refers to the shaking and blurrin...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/33; H04N5/21
Inventor: 孟春芝, 王浩, 蔡进
Owner: SPREADTRUM COMM (TIANJIN) INC