
A Real-time Video Stitching Method Based on Improved Surf Algorithm

A real-time video stitching technology in the field of video splicing, which addresses the limited stitching speed of existing methods and achieves real-time stitching.

Inactive Publication Date: 2019-07-23
HOHAI UNIV

AI Technical Summary

Problems solved by technology

Zhang Chaowei et al. [Zhang Chaowei, Zhou Yan, Wang Yaokang, Cheng Yan. Video stitching method based on SIFT feature tracking and matching [J]. Computer Engineering and Applications, 2008, 44(10): 169-172] obtain the matching features of subsequent frames through feature motion estimation between adjacent frames, which speeds up the stitching, but the method reaches only about 5 frames per second.



Examples


Embodiment Construction

[0036] The invention provides a real-time video stitching method that addresses the strict timing requirements of real-time video stitching and the shortcomings of existing stitching methods. The method first uses the phase correlation coefficient to obtain the overlapping range of the two cameras, so that only the image regions within that overlap are processed. The SURF algorithm is then improved by simplifying the generation of feature point descriptors and reducing the descriptor dimension, and the improved SURF algorithm is used to extract features in the overlapping area of each video frame. A feature registration method based on block matching registers the video frames and yields the transformation matrix. Finally, the projection matrix is updated in real time according to the correlation coefficient between adjacent frames, which realizes the stitching of the videos. In order to facilitate the public's understanding, the tec...
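The overlap-detection step described above lends itself to a short illustration. The following is a minimal Python/OpenCV sketch of phase-correlation overlap estimation, not the patent's own code: the function name estimate_overlap, the variable names, and the assumption of a horizontally displaced left/right camera pair are all illustrative.

```python
import cv2
import numpy as np

# Illustrative sketch only; names and the left/right camera layout are assumptions.
def estimate_overlap(frame_left, frame_right):
    # cv2.phaseCorrelate expects single-channel floating-point images of equal size.
    g1 = cv2.cvtColor(frame_left, cv2.COLOR_BGR2GRAY).astype(np.float32)
    g2 = cv2.cvtColor(frame_right, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # (dx, dy) is the translation that best aligns the two frames; `response`
    # measures how sharp the correlation peak is (a rough confidence score).
    (dx, dy), response = cv2.phaseCorrelate(g1, g2)

    # If the right camera view is displaced by dx pixels, roughly the last
    # width - |dx| columns of the left frame overlap the right frame.
    width = frame_left.shape[1]
    overlap_width = int(max(0.0, width - abs(dx)))
    return overlap_width, response
```

Only the pixel columns inside this overlap band would then be handed to the improved SURF feature extraction, which is what keeps the per-frame processing cost low.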



Abstract

The invention discloses a real-time video stitching method, which belongs to the technical field of video and image processing. The method uses the phase correlation method to calculate the overlapping area of the video frame images to be stitched. The SURF (Speeded Up Robust Features) algorithm is improved by simplifying the generation of feature point descriptors and reducing the descriptor dimension, and the improved SURF algorithm is used to extract feature points in the overlapping areas of the video frames. An image registration method based on feature block matching is proposed to match the feature points, which reduces the amount of computation, improves efficiency, and quickly yields the image transformation model. Finally, a projection matrix update method based on the correlation coefficient between frames is proposed to prevent erroneous stitching results, and the video frame images are fused to achieve real-time video stitching.
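As a rough illustration of the last two steps (feature registration and the correlation-based projection matrix update), the sketch below re-estimates the homography only when the correlation coefficient between adjacent frames drops below a threshold. It is an assumption-laden stand-in rather than the patent's procedure: it needs opencv-contrib for SURF, uses ordinary ratio-test descriptor matching in place of the patent's feature-block matching, and the 0.9 threshold is an assumed value.

```python
import cv2
import numpy as np

# Sketch only: opencv-contrib SURF, ratio-test matching instead of the patent's
# feature-block matching, and an assumed correlation threshold of 0.9.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
matcher = cv2.BFMatcher(cv2.NORM_L2)

def update_projection(prev_gray, curr_gray, ref_gray, H_prev, corr_thresh=0.9):
    # Normalized correlation coefficient between adjacent frames of one camera;
    # with equally sized inputs, matchTemplate returns a single value.
    corr = cv2.matchTemplate(curr_gray, prev_gray, cv2.TM_CCOEFF_NORMED)[0, 0]
    if H_prev is not None and corr >= corr_thresh:
        return H_prev  # scene barely changed: reuse the previous matrix

    # Otherwise re-estimate: SURF features in both views, ratio-test matching,
    # and a RANSAC homography fit.
    kp1, des1 = surf.detectAndCompute(curr_gray, None)
    kp2, des2 = surf.detectAndCompute(ref_gray, None)
    if des1 is None or des2 is None:
        return H_prev
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.7 * n.distance]
    if len(good) < 4:
        return H_prev
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H if H is not None else H_prev
```

Gating the re-estimation this way is what allows the projection matrix to be refreshed in real time without running feature matching on every frame.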

Description

Technical field

[0001] The invention relates to a video stitching method that stitches the video streams from two cameras in real time, and belongs to the technical field of image and video processing.

Background technique

[0002] Video stitching splices multiple partially overlapping videos into a wide-view, high-resolution panoramic video. It is developed on the basis of image stitching and is an application and extension of image stitching technology. Video stitching is widely used in intelligent monitoring, video conferencing, medical microscopic video, aerial video and other fields.

[0003] Video stitching combines the video streams captured by multiple cameras into a panoramic video. Unlike image stitching, video stitching has relatively strict time requirements; although existing video stitching methods take time complexity into account, their real-time performance is relatively poor. Zheng M et al. developed a camera...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T3/00
CPC: G06T3/14
Inventor: 吴学文, 原帅, 周燕, 刘娜
Owner: HOHAI UNIV