Multi-video real-time panoramic fusion splicing method based on CUDA

A fusion-splicing technology for multiple videos, applied in the field of CUDA-based multi-video real-time panoramic fusion splicing, achieving a remarkable monitoring effect and offering wide application prospects.

Inactive Publication Date: 2014-08-20
WISESOFT CO LTD +1

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a CUDA-based multi-video real-time panoramic fusion stitching method, which aims to solve the ghosting and image misalignment phenomena that occur...




Embodiment Construction

[0038] The implementation process of the present invention will be described below in conjunction with the accompanying drawings, taking four video sources output by four cameras (Camera1, Camera2, Camera3, Camera4) as an example.

[0039] As shown in Figure 1, the implementation process of the present invention is mainly divided into two stages, system initialization and real-time video frame fusion:

[0040] 1. System initialization stage:

[0041] (1) Obtain the first frame image of each video channel, and perform registration and cylindrical projection transformation to obtain the overall perspective transformation model (a cylindrical-warp sketch follows this list);

[0042] (2) Perform perspective transformation on the first frame of each video according to the perspective transformation model, and at the same time obtain the perspective transformation mask and the overlap-area mask maps between adjacent video sources (Camera1-2, Camera2-3, Camera3-4);

[0043] (3) Use the dynamic progra...
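
The published text does not include the projection code itself; below is a minimal CUDA sketch of how the cylindrical projection of step (1) and the warp-validity mask of step (2) could be computed by inverse mapping, assuming a known focal length f in pixels and nearest-neighbor sampling. The kernel name, parameter list and mask layout are illustrative assumptions, not the patent's implementation.

    #include <cuda_runtime.h>

    // Hypothetical inverse-mapping cylindrical warp: every output pixel is traced
    // back onto the source image plane; pixels that fall outside the source are
    // marked invalid in the mask (0), valid pixels are marked 255.
    __global__ void cylindricalWarpKernel(const uchar3* src, int srcW, int srcH, size_t srcPitch,
                                          uchar3* dst, unsigned char* mask,
                                          int dstW, int dstH, size_t dstPitch,
                                          float f /* focal length in pixels, assumed known */)
    {
        int u = blockIdx.x * blockDim.x + threadIdx.x;   // output column
        int v = blockIdx.y * blockDim.y + threadIdx.y;   // output row
        if (u >= dstW || v >= dstH) return;

        // Output coordinates relative to the cylinder center.
        float xc = u - 0.5f * dstW;
        float yc = v - 0.5f * dstH;

        // Inverse cylindrical mapping back to the source image plane:
        // x = f*tan(theta), y = yc / cos(theta), with theta = xc / f.
        float theta = xc / f;
        float xs = f * tanf(theta) + 0.5f * srcW;
        float ys = yc / cosf(theta) + 0.5f * srcH;

        bool valid = (xs >= 0.f && xs < srcW - 1 && ys >= 0.f && ys < srcH - 1);
        mask[v * dstW + u] = valid ? 255 : 0;            // warp-validity mask (tightly packed)

        uchar3* dstRow = (uchar3*)((char*)dst + v * dstPitch);
        if (valid) {
            const uchar3* srcRow = (const uchar3*)((const char*)src + (int)ys * srcPitch);
            dstRow[u] = srcRow[(int)xs];                 // nearest-neighbor sample for brevity
        } else {
            dstRow[u] = make_uchar3(0, 0, 0);
        }
    }

    // Launch example (host side):
    //   dim3 block(16, 16);
    //   dim3 grid((dstW + 15) / 16, (dstH + 15) / 16);
    //   cylindricalWarpKernel<<<grid, block>>>(dSrc, srcW, srcH, srcPitch,
    //                                          dDst, dMask, dstW, dstH, dstPitch, f);

In a real pipeline the mapping would normally be precomputed once during initialization as a lookup table and bilinear interpolation would replace the nearest-neighbor sample; the overlap-area masks of step (2) can then be obtained by intersecting the validity masks of adjacent cameras.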



Abstract

The invention discloses a CUDA-based multi-video real-time panoramic fusion splicing method. The splicing method includes a system initialization step and a real-time video frame fusion step: the system initialization step is executed at the CPU (host) end of the CUDA platform, and the real-time video frame fusion step is executed at the GPU end, where, following the CUDA stream processing mode, four concurrently processed execution streams are set up and the fusion sub-steps S21, S22, S23 and S24 are deployed to the corresponding stream processing sequences. Compared with the prior art, the method has the following advantages: it produces a multi-video real-time panoramic video without ghosting or color/brightness differences, its ultrahigh-resolution overall monitoring effect on large-scale scenes such as airports, parks and squares is remarkable, and its application prospects are wide.
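
The abstract states only that four concurrently processed execution streams are set up at the GPU end; the following is a minimal sketch of that stream-per-camera pattern, assuming each camera frame receives an asynchronous upload, a placeholder kernel standing in for the unpublished fusion sub-steps (S21 to S24), and an asynchronous download. fuseKernel, processFrames and the buffer names are hypothetical, not the patent's API.

    #include <cuda_runtime.h>

    #define NUM_CAMERAS 4

    // Placeholder for the per-stream fusion work; the patent's actual sub-steps
    // (perspective warp, seam placement, blending, ...) are not published here.
    __global__ void fuseKernel(const unsigned char* in, unsigned char* out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = in[i];   // copy-through stand-in
    }

    void processFrames(unsigned char* hostIn[NUM_CAMERAS], unsigned char* hostOut[NUM_CAMERAS],
                       unsigned char* devIn[NUM_CAMERAS], unsigned char* devOut[NUM_CAMERAS],
                       int frameBytes)
    {
        cudaStream_t streams[NUM_CAMERAS];
        for (int c = 0; c < NUM_CAMERAS; ++c)
            cudaStreamCreate(&streams[c]);

        dim3 block(256);
        dim3 grid((frameBytes + block.x - 1) / block.x);

        // One execution stream per camera: uploads, kernels and downloads of the
        // four streams are issued back to back so they can overlap on the GPU.
        for (int c = 0; c < NUM_CAMERAS; ++c) {
            cudaMemcpyAsync(devIn[c], hostIn[c], frameBytes, cudaMemcpyHostToDevice, streams[c]);
            fuseKernel<<<grid, block, 0, streams[c]>>>(devIn[c], devOut[c], frameBytes);
            cudaMemcpyAsync(hostOut[c], devOut[c], frameBytes, cudaMemcpyDeviceToHost, streams[c]);
        }

        for (int c = 0; c < NUM_CAMERAS; ++c) {
            cudaStreamSynchronize(streams[c]);
            cudaStreamDestroy(streams[c]);
        }
    }

For the asynchronous copies to actually overlap with kernel execution, the host frame buffers would need to be allocated as pinned memory (for example with cudaHostAlloc).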

Description

Technical Field

[0001] The invention relates to the fields of graphics/image processing and computer vision, and in particular to a CUDA-based real-time panoramic fusion stitching method for multiple videos.

Background Technique

[0002] The purpose of panoramic video stitching technology is to splice and fuse the real-time video information captured by cameras scattered at different points in a scene into one panoramic real-time video. Some video image acquisition devices, such as fisheye cameras, can capture almost 360-degree panoramic scene information, but their image resolution is limited and their texture details are not rich enough; some high-definition network cameras can capture images with resolutions of up to tens of millions of pixels, but, limited by the camera's viewing angle, the view they accommodate is smaller. The purpose of panoramic video stitching is to meet the needs of a panoramic perspective and high-definition scene information at the same t...

Claims


Application Information

IPC(8): H04N5/262, H04N5/265, H04N9/64, G06T5/50, G06T15/00
Inventors: 兰时勇, 吴岳洲, 吴健, 黄飞虎
Owner: WISESOFT CO LTD