Narrow-view-field double-camera image fusion method based on large virtual camera

A virtual-camera and dual-camera technology, applied in the field of aerospace and aerial photogrammetry, which addresses problems such as difficulty in surveying and mapping, poor geometric accuracy, and the lack of a physical imaging model for the stitched image.

Publication Date: 2014-04-02 (Inactive)
WUHAN UNIV

AI Technical Summary

Problems solved by technology

The other approach stitches the images based only on matching homologous (same-name) points in the overlapping area of the two cameras, without regard to the geometric conditions of the original single-camera images...



Examples


Embodiment Construction

[0044] The technical solution of the present invention will be described in detail below in conjunction with the drawings and embodiments.

[0045] Referring to Figure 3, this embodiment targets narrow-field-of-view dual-camera images acquired by simultaneous push-broom imaging. The execution steps are as follows, and the whole process can be run automatically with computer software:

[0046] Step 1: Establish the geometric imaging parameters of the large virtual camera from the geometric imaging parameters of the two single cameras.
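As a rough illustration of how two single-camera parameter sets might be combined into one virtual parameter set (a sketch only, under assumptions stated in the comments, not the patent's actual formulation), consider the following Python fragment. The LinearArrayCamera fields, the focal-length averaging, and the assumed overlap width are all hypothetical.

```python
# Illustrative sketch: one possible way to derive "large virtual camera"
# parameters from two single-camera parameter sets. The field names, the
# simple averaging rule, and the assumed overlap are NOT from the patent.
from dataclasses import dataclass
import numpy as np

@dataclass
class LinearArrayCamera:
    focal_length: float        # focal length, metres
    detector_pitch: float      # detector element size, metres
    n_detectors: int           # detector elements on the line array
    R_cam_to_body: np.ndarray  # 3x3 mounting rotation, camera frame -> body frame

def build_virtual_camera(cam_a: LinearArrayCamera,
                         cam_b: LinearArrayCamera,
                         assumed_overlap: int = 200) -> LinearArrayCamera:
    """Define a virtual linear-array camera spanning both single cameras."""
    # Assume a common focal length and detector pitch (here: the mean of the two).
    f_virtual = 0.5 * (cam_a.focal_length + cam_b.focal_length)
    pitch = 0.5 * (cam_a.detector_pitch + cam_b.detector_pitch)
    # Make the virtual line array wide enough to cover both fields of view,
    # subtracting an assumed number of overlapping detector elements.
    n_virtual = cam_a.n_detectors + cam_b.n_detectors - assumed_overlap
    # Align the virtual camera with the satellite body frame (identity mounting).
    return LinearArrayCamera(f_virtual, pitch, n_virtual, np.eye(3))
```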

[0047] The invention proposes the concept of a large virtual camera and computes the line-of-sight direction of each detector element in the satellite body coordinate system.

[0048] First, the satellite body coordinate system and the single camera coordinate system are introduced.

[0049] Satellite body coordinate system o-X_bY_bZ_b: the origin o is located at the center of mass of the satellite, and the X_b, Y_b, Z_b axes are the ro...
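With the body and camera frames defined as above, the per-detector line of sight can be sketched as follows. This assumes a generic pinhole linear-array model in which detector element i sits at an across-track offset from the array centre on the focal plane, and the camera mounting matrix rotates the camera-frame ray into the body frame; it is an illustrative formulation, not necessarily the one used in the patent.

```python
import numpy as np

def detector_los_in_body(i: int, camera) -> np.ndarray:
    """Unit line-of-sight vector of detector element i in the satellite body frame.

    Generic pinhole linear-array model (illustrative assumption): the camera-frame
    ray points along the boresight (+Z) with an across-track component given by
    the detector's offset from the centre of the line array.
    """
    centre = (camera.n_detectors - 1) / 2.0
    x_cam = (i - centre) * camera.detector_pitch      # across-track focal-plane offset
    ray_cam = np.array([x_cam, 0.0, camera.focal_length])
    ray_cam /= np.linalg.norm(ray_cam)
    # Rotate the ray from the camera frame into the body frame via the mounting matrix.
    return camera.R_cam_to_body @ ray_cam
```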



Abstract

The invention discloses a narrow-view-field double-camera image fusion method based on a large virtual camera. The method comprises the following steps: constructing the large virtual camera from the two single cameras, and building its geometric imaging parameters from the geometric imaging parameters of the two single cameras; constructing the corresponding geometric imaging models from the geometric imaging parameters of the two single cameras and of the large virtual camera; calculating and outputting the rational polynomial model coefficients corresponding to the large virtual camera; performing indirect-method geometric correction on each single-camera image, using the forward and backward coordinate computations of the geometric imaging models, to obtain two images in the image coordinate system of the large virtual camera; and combining them into the fused image of the large virtual camera. The method makes ingenious use of the large-virtual-camera concept, achieves high-precision fusion of the two camera images within a narrow field of view, and supplies the rational polynomial model coefficients of the large virtual camera. The processing flow is fully automatic, requires no manual intervention, and is suitable for a ground preprocessing pipeline.
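To make the indirect-method correction step concrete, the sketch below resamples one single-camera image onto the virtual camera's image grid: for each virtual-camera pixel, a forward projection to the ground and a backward projection into the single-camera image yield the source coordinates, where the grey value is picked up. The two projection callbacks stand in for the geometric imaging models described in the abstract and are placeholders, as is the nearest-neighbour resampling.

```python
import numpy as np

def indirect_correction(src_image: np.ndarray,
                        virtual_shape: tuple,
                        forward_virtual_to_ground,    # (row, col) -> (lat, lon, h), assumed callback
                        backward_ground_to_single):   # (lat, lon, h) -> (row, col), assumed callback
    """Resample one single-camera image onto the virtual-camera grid (illustrative only)."""
    rows, cols = virtual_shape
    out = np.zeros((rows, cols), dtype=src_image.dtype)
    for r in range(rows):
        for c in range(cols):
            ground = forward_virtual_to_ground(r, c)      # virtual pixel -> ground point
            sr, sc = backward_ground_to_single(*ground)   # ground point -> source pixel
            sr_i, sc_i = int(round(sr)), int(round(sc))
            if 0 <= sr_i < src_image.shape[0] and 0 <= sc_i < src_image.shape[1]:
                # Nearest-neighbour pickup for brevity; bilinear resampling in practice.
                out[r, c] = src_image[sr_i, sc_i]
    return out
```

Running this once per single camera would yield two images on the common virtual-camera grid, which could then be mosaicked (for example by feathering across the overlap) to form the fused image.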

Description

Technical Field

[0001] The invention belongs to the field of aerospace and aerial photogrammetry, and relates to a narrow-field dual-camera image stitching method based on a large virtual camera for the case where two narrow-field linear-array cameras perform push-broom imaging simultaneously.

Background Technique

[0002] Linear-array push-broom imaging is currently the main sensor mode for acquiring high-resolution optical satellite images. To improve the spatial resolution of optical images, a telephoto lens is often used, but a telephoto lens narrows the observation field of view. To widen the field of view, multiple CCDs (charge-coupled device image sensors) are stitched together, or multiple cameras observe simultaneously. When multiple cameras observe simultaneously, each camera has an independent optical system and follows its own geometric imaging model, which adds extra work to subsequent geometric processing.

[0003] There are t...


Application Information

IPC(8): G01C11/04
CPC: G01C11/025; G01C11/04; G06T3/4038
Inventor: 金淑英, 王密
Owner: WUHAN UNIV