Video sequence registering method based on combination of motion information and background information

A video-sequence and background-information technology, applied in the field of image processing, that addresses the problems of non-discriminative trajectory descriptors, large errors in video sequence registration results, and inability to complete registration; it achieves highly discriminative trajectory descriptors and accurate, reliable video sequence registration results.

Active Publication Date: 2015-01-07
XIDIAN UNIV

Problems solved by technology

The disadvantage of this method is that it considers only the information of the trajectory itself, describing each trajectory point by the local features in its neighborhood. When the scene contains multiple moving objects with similar trajectories, or when a trajectory contains many segments with similar local characteristics, the resulting trajectory descriptors are poorly discriminative. In these cases, the video sequence registration results obtained by this method have large errors, or the registration cannot be completed at all.




Embodiment Construction

[0042] The present invention will be further described below in conjunction with the accompanying drawings.

[0043] Referring to Figure 1, the concrete steps of the present invention are as follows:

[0044] Step 1: Input the reference video sequence l and the video sequence to be registered l'. Separate the background image and the moving objects in each of the two input video sequences, obtaining the reference motion trajectory P and the motion trajectory to be registered P'.
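Step 1's background/foreground separation can be sketched with a per-pixel temporal median background model. This is a minimal NumPy illustration, not the patent's own separation procedure; the function name, the `(T, H, W)` frame layout, and the difference threshold of 25 are assumptions chosen for the example.

```python
import numpy as np

def separate_background(frames, thresh=25):
    """Estimate a static background as the per-pixel temporal median,
    then flag moving-object pixels frame by frame.

    frames : (T, H, W) uint8 array of grayscale video frames.
    Returns (background, masks), where masks is a (T, H, W) bool array
    that is True wherever a frame differs strongly from the background.
    """
    frames = np.asarray(frames, dtype=np.uint8)
    background = np.median(frames, axis=0).astype(np.uint8)
    # A pixel belongs to a moving object when its intensity deviates
    # from the background estimate by more than the threshold.
    diff = np.abs(frames.astype(np.int16) - background.astype(np.int16))
    masks = diff > thresh
    return background, masks

# Toy clip: a constant background with one bright pixel sweeping across row 3.
T, H, W = 9, 8, 8
clip = np.full((T, H, W), 50, dtype=np.uint8)
for t in range(T):
    clip[t, 3, t % W] = 255          # one moving bright pixel per frame
bg, masks = separate_background(clip)
print(bg[3, 0], masks[0, 3, 0])      # 50 True
```

The per-frame masks can then be reduced to moving-object centroids, whose sequence over time forms the motion trajectory P used in the later steps.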

[0045] Step 2: Apply feature point detection and matching to the obtained background images, yielding the feature point matching pairs of the background images (X_i, X'_i), 300 ≤ i ≤ 500.
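The matching half of Step 2 can be illustrated with brute-force nearest-neighbour descriptor matching plus a ratio test. This is a generic sketch, not the patent's specific detector/matcher; the function name, the toy descriptors, and the 0.8 ratio are illustrative assumptions.

```python
import numpy as np

def match_descriptors(desc_ref, desc_tgt, ratio=0.8):
    """Brute-force nearest-neighbour matching with a ratio test.

    desc_ref : (N, D) float array of descriptors from the reference view.
    desc_tgt : (M, D) float array of descriptors from the other view.
    Returns a list of (i, j) index pairs, keeping only unambiguous matches.
    """
    # Pairwise Euclidean distances between all descriptor pairs.
    d = np.linalg.norm(desc_ref[:, None, :] - desc_tgt[None, :, :], axis=2)
    matches = []
    for i in range(d.shape[0]):
        order = np.argsort(d[i])
        best, second = order[0], order[1]
        # Accept only when the best match is clearly closer than the runner-up.
        if d[i, best] < ratio * d[i, second]:
            matches.append((i, int(best)))
    return matches

# Toy descriptors: rows 0..2 of ref correspond to rows 2, 0, 1 of tgt.
ref = np.array([[1., 0.], [0., 1.], [1., 1.]])
tgt = np.array([[0., 1.], [1., 1.], [1., 0.]])
print(match_descriptors(ref, tgt))   # [(0, 2), (1, 0), (2, 1)]
```

The surviving pairs play the role of the (X_i, X'_i) background matches consumed by Steps 3 and 4.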

[0046] Step 3: Use the obtained feature point matching pairs of the background images (X_i, X'_i) to calculate the fundamental matrix F between the background images.
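A standard way to compute F from point matches, as Step 3 requires, is the normalized 8-point algorithm. The sketch below is one common realization under the assumption of noiseless matches; the patent does not specify which estimator it uses, and the function name and synthetic-camera check are illustrative.

```python
import numpy as np

def eight_point_fundamental(x1, x2):
    """Normalized 8-point estimate of F satisfying x2h^T F x1h = 0.
    x1, x2 : (N, 2) arrays of matched pixel coordinates, N >= 8."""
    def normalize(pts):
        # Translate to the centroid and scale to mean distance sqrt(2).
        c = pts.mean(axis=0)
        s = np.sqrt(2) / np.mean(np.linalg.norm(pts - c, axis=1))
        T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1.0]])
        ph = np.column_stack([pts, np.ones(len(pts))]) @ T.T
        return ph, T

    p1, T1 = normalize(np.asarray(x1, float))
    p2, T2 = normalize(np.asarray(x2, float))
    # Each match contributes one row of the homogeneous system A f = 0.
    A = np.column_stack([p2[:, 0:1] * p1, p2[:, 1:2] * p1, p1])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2: every fundamental matrix is singular.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    F = T2.T @ F @ T1                      # undo the normalization
    return F / np.linalg.norm(F)           # fix the scale ambiguity

# Synthetic check: project random 3D points through two known cameras.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(-1, 1, (20, 2)),
                     rng.uniform(4, 8, 20), np.ones(20)])
th = 0.1                                   # small rotation about the y axis
R = np.array([[np.cos(th), 0, np.sin(th)], [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])           # reference camera
P2 = np.hstack([R, np.array([[-1.0], [0.0], [0.0]])])   # rotated + translated
x1 = X @ P1.T; x1 = x1[:, :2] / x1[:, 2:]
x2 = X @ P2.T; x2 = x2[:, :2] / x2[:, 2:]
F = eight_point_fundamental(x1, x2)
res = [abs(np.r_[b, 1] @ F @ np.r_[a, 1]) for a, b in zip(x1, x2)]
print(max(res) < 1e-6)   # True: the epipolar constraint holds for every match
```

With real, noisy matches one would wrap this estimator in RANSAC rather than use all 300-500 pairs directly.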

[0047] Step 4: From the obtained feature point matching pairs of the background images (X_i, X'_i), select four sets of matching p...



Abstract

The invention discloses a video sequence registering method based on the combination of motion information and background information. The method mainly solves the problem that, in the prior art, video sequences from different viewpoints cannot be accurately registered. The method includes the steps that (1) background images and moving targets are separated in the two input video sequences; (2) feature point matching pairs of the background images are obtained, and the fundamental matrix between the background images is calculated; (3) four sets of matching point pairs, corresponding to four spatial points not on the same plane in three-dimensional space, are selected from the background images; (4) the intersection points of the projection lines of motion-trajectory points with the epipolar lines in the other video sequence are obtained; (5) motion trajectory points are matched and a point pair set of candidate times is obtained; (6) a time line is fitted and the time transformation parameters are recovered. The time relation between video sequences can be accurately recovered, registering accuracy is improved, and the method can be used for registering video sequences with static backgrounds.
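Two small pieces of the pipeline above can be made concrete: in step (4), the epipolar line of a reference-view point x is l' = F x in the other view; in step (6), fitting the "time line" recovers an affine temporal map t' = α·t + β between frame indices. The sketch below uses ordinary least squares for the fit, whereas a robust fit over candidate pairs would be used in practice; the function names, the toy F, and the frame-rate example are illustrative assumptions.

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F x in the second view for a point x = (u, v)
    in the reference view; returns (a, b, c) with a*u' + b*v' + c = 0."""
    return F @ np.array([x[0], x[1], 1.0])

def fit_time_line(pairs):
    """Least-squares fit of the temporal mapping t' = alpha * t + beta
    from candidate (t, t') frame-index pairs."""
    t = np.array([p[0] for p in pairs], float)
    tp = np.array([p[1] for p in pairs], float)
    A = np.column_stack([t, np.ones_like(t)])
    (alpha, beta), *_ = np.linalg.lstsq(A, tp, rcond=None)
    return alpha, beta

# Toy F for a pure horizontal translation between the two views.
F = np.array([[0., 0., 0.], [0., 0., 1.], [0., -1., 0.]])
l = epipolar_line(F, (2.0, 3.0))
print(l)                                 # [ 0.  1. -3.]  -> the line v' = 3

# Toy timing: second camera runs at half the frame rate, offset by 3 frames.
pairs = [(t, 0.5 * t + 3) for t in range(0, 20, 2)]
alpha, beta = fit_time_line(pairs)
print(round(alpha, 6), round(beta, 6))   # 0.5 3.0
```

Matching each trajectory point to the intersection of its epipolar line with the other trajectory yields the candidate (t, t') pairs that the time-line fit consumes.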

Description

Technical field

[0001] The present invention belongs to the technical field of image processing, and further relates to a video sequence registration method whose purpose is to calibrate video sequences shot from different viewing angles or at different times; it can be applied to the registration of video sequences with static backgrounds.

Background technique

[0002] Video sequence registration is an important area of image processing. It calculates the spatio-temporal transformation parameters between similar videos of the same scene, obtained by multiple sensors from different perspectives or at different times, and then synchronizes these videos in time and geometrically calibrates them in space. In general, feature-based methods can be used, in particular methods characterized by moving-object trajectories. These motion-trajectory-based video sequence registration methods mainly consider the shape of the motion trajectory itself or the polarity b...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/20
CPC: G06V20/48; G06V2201/07
Inventor: Zhang Qiang, Bi Fei, Xiang Peng, Wang Yabin, Wang Long
Owner XIDIAN UNIV