Panorama video automatic stitching method based on SURF feature tracking matching

A panoramic video automatic stitching technology, applied in image data processing, instrumentation, and computing. It addresses the problems of low feature-matching efficiency and poor stitching results in video stitching, and achieves improved efficiency, increased processing speed, and high processing throughput.

Active Publication Date: 2016-07-20
上海贵和软件技术有限公司

Problems solved by technology

[0011] In view of this, and aiming at the problems of low feature-matching efficiency and poor stitching results in video stitching, the present invention discloses a panoramic video automatic stitching method based on SURF feature tracking and matching: SURF feature matching based on bucket mapping, video coordinate transformation based on the projection model, ...

Embodiment Construction

[0069] Exemplary embodiments of the present invention will be described in detail with reference to the following drawings.

[0070] Embodiments of the present invention provide a panoramic video automatic stitching method based on SURF feature tracking and matching, as shown in Figure 1. A brief overview of the method's workflow is given first:

[0071] Step 1: Read the i-th video frame for the two videos to be stitched;

[0072] Step 2: If i=1, that is, the frame is the first frame of the video, proceed to step 3; otherwise, directly proceed to step 5;

[0073] Step 3: Quickly extract SURF feature points for the first frames of the two videos, and generate feature point description vectors;

[0074] Step 4: For the first frames of the two videos, based on hash mapping and bucket storage, search for similar SURF feature vector point pairs to form a similar feature set (steps 3 and 4 are sketched in code after this list);

[0075] Step 5: Based on the video coordinate transformation of the projection model, ...
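
Steps 3 and 4 can be illustrated with a short sketch. The snippet below uses OpenCV's contrib SURF implementation (opencv-contrib-python) to extract feature points and 64-dimensional description vectors from the first frame of each video, then groups the descriptors into hash buckets so that only descriptors landing in the same bucket are compared. The patent page does not spell out the exact hash function, bucket layout, or thresholds, so the sign-bit bucketing, the distance threshold, and the file names used here are illustrative assumptions, not the patent's exact scheme.

```python
# Illustrative sketch of steps 3-4: SURF extraction plus a bucket-based candidate
# search. Requires opencv-contrib-python (SURF lives in cv2.xfeatures2d); the hash
# scheme, thresholds, and file names are assumptions, not the patent's exact method.
from collections import defaultdict

import cv2
import numpy as np


def extract_surf(frame, hessian_threshold=400):
    """Detect SURF keypoints and 64-D description vectors on a grayscale frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    keypoints, descriptors = surf.detectAndCompute(gray, None)
    return keypoints, descriptors


def bucket_key(descriptor, dims=16):
    """Map a descriptor to a bucket via the sign pattern of its first `dims` entries."""
    return tuple((descriptor[:dims] > 0).astype(int).tolist())


def match_by_buckets(desc_a, desc_b, max_dist=0.25):
    """Pair descriptors that fall into the same bucket and are close in L2 distance."""
    buckets = defaultdict(list)
    for j, d in enumerate(desc_b):
        buckets[bucket_key(d)].append(j)

    pairs = []
    for i, d in enumerate(desc_a):
        best_j, best_dist = -1, max_dist
        for j in buckets.get(bucket_key(d), []):
            dist = float(np.linalg.norm(d - desc_b[j]))
            if dist < best_dist:
                best_j, best_dist = j, dist
        if best_j >= 0:
            pairs.append((i, best_j))
    return pairs


if __name__ == "__main__":
    cap_left = cv2.VideoCapture("left.mp4")    # hypothetical input videos
    cap_right = cv2.VideoCapture("right.mp4")
    ok_l, frame_l = cap_left.read()
    ok_r, frame_r = cap_right.read()
    if ok_l and ok_r:
        kp_l, desc_l = extract_surf(frame_l)
        kp_r, desc_r = extract_surf(frame_r)
        if desc_l is not None and desc_r is not None:
            matches = match_by_buckets(desc_l, desc_r)
            print(f"similar feature set: {len(matches)} point pairs")
```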


Abstract

The invention discloses a panoramic video automatic stitching method based on SURF feature tracking and matching. The method processes frames differently depending on whether they are the first frame: the first frame is used to select the video reference coordinate system, while subsequent frames are fused directly. For the first frame of each video to be stitched, SURF feature points are extracted and feature point description vectors are generated; based on hash mapping and bucket storage, similar SURF feature vector point pairs are searched for to form a similar feature set; the point pairs in the similar feature set are used to solve for the coordinate transformation with the optimal data correlation degree; finally, the coordinate-transformed pixel values of the frames to be stitched are combined by dynamic weighted summation to achieve seamless stitching and fusion of the videos. The method achieves omnidirectional, multi-view, stereoscopic seamless video stitching and fusion; it not only removes the seams, blurring, and ghosting caused by image translation, rotation, scaling, and affine transformation, but also improves the efficiency and accuracy of feature-matching-based image registration.
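
The last two stages described in the abstract, solving the frame-to-frame coordinate transformation from the similar feature set and then dynamically weighting the coordinate-transformed pixel values, can be sketched as follows, reusing the kp_l, kp_r, and matches produced in the earlier sketch. The patent's solver for the coordinate system with the optimal data correlation degree is not detailed on this page, so a standard RANSAC homography fit stands in for it, and the linear ramp below is just one simple form of dynamic weighted summation; both are assumptions for illustration.

```python
# Illustrative sketch of the transform-and-fuse stages from the abstract. A RANSAC
# homography fit stands in for the patent's optimal-correlation coordinate solver,
# and a linear column ramp is one simple form of dynamic weighted summation.
import cv2
import numpy as np


def weighted_blend(frame_left, warped_right):
    """Dynamic weighted summation over the overlap between the left frame and the
    right frame already warped onto the panorama canvas."""
    h, w = frame_left.shape[:2]
    canvas = warped_right.copy()
    region = canvas[:h, :w].astype(np.float32)
    left = frame_left.astype(np.float32)
    overlap = region.sum(axis=2) > 0                 # warped right frame covers this pixel
    # Weight of the left frame falls linearly from 1 to 0 across its width.
    ramp = np.tile(np.linspace(1.0, 0.0, w), (h, 1))[..., None]
    blended = ramp * left + (1.0 - ramp) * region
    canvas[:h, :w] = np.where(overlap[..., None], blended, left).astype(np.uint8)
    return canvas


def stitch_pair(frame_left, frame_right, kp_l, kp_r, pairs):
    """Warp the right frame into the left frame's coordinate system, then fuse."""
    # Needs at least 4 point pairs; pairs come from the bucket matcher above.
    src = np.float32([kp_r[j].pt for _, j in pairs]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[i].pt for i, _ in pairs]).reshape(-1, 1, 2)
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # 5 px reprojection tolerance
    h, w = frame_left.shape[:2]
    canvas_size = (w + frame_right.shape[1], h)               # (width, height) for warpPerspective
    warped_right = cv2.warpPerspective(frame_right, H, canvas_size)
    return weighted_blend(frame_left, warped_right)
```

Per steps 1 and 2 of the method above, the feature extraction and matching run only on the first frames; the transformation solved there can then be reused to warp and fuse every subsequent frame pair.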

Description

Technical Field

[0001] The invention relates to the technical field of computer graphics and imaging, and in particular to a panoramic video automatic stitching method based on SURF feature tracking and matching.

Background Technology

[0002] Video images are the main way for humans to obtain visual information. The video surveillance information systems built in important national locations, rail transit, key areas, sensitive areas and other places play an irreplaceable role in political activities, daily police work, and social public security management. As one of the focal points of image research in recent years, various video stitching approaches have been proposed by researchers at home and abroad. According to the video image matching method used, video stitching technology can generally be divided into the following two types:

[0003] 1. Video stitching technology based on region correlation

[0004] Video stitching technology based on region correlation ...


Application Information

IPC(8): G06T3/40
CPC: G06T3/4038; G06T2207/20024
Inventor: 朱珂, 许维纲, 夏冰
Owner: 上海贵和软件技术有限公司