
A Video Stabilization Method Based on Local and Global Motion Disparity Compensation

A difference-compensation and video-stabilization technology, applied in the field of television and color-television components, etc., which can solve the problems of low stabilization rate and high content loss.

Active Publication Date: 2019-02-19
BEIJING INSTITUTE OF TECHNOLOGY
6 Cites · 0 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to propose a video stabilization method based on local and overall motion difference compensation, in order to address the problems of low stabilization rate and excessive content loss in existing video stabilization techniques.



Examples


Embodiment Construction

[0095] An embodiment of the method of the present invention is described in detail below with reference to Figure 1.

[0096] A video stabilization method based on local and overall motion difference compensation; the specific steps are as described in steps 1 to 11 of the summary of the invention;

[0097] In step 1, the detection of the feature points p is implemented with the feature point detection function GoodFeaturesToTrack() from the OpenCV computer vision library, and the tracking of the feature points is computed with the feature point tracking function calcOpticalFlowPyrLK() from the same library; in the present embodiment, the number of video frames N is 387;
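The following is a minimal OpenCV (Python) sketch of this detection-and-tracking step. Only the two OpenCV calls are those named in the paragraph above; the wrapper name track_features and all parameter values (maxCorners, qualityLevel, window size, pyramid levels) are illustrative assumptions, not values taken from the patent.

```python
# Sketch of step 1: detect feature points in the previous frame and track
# them into the current frame with pyramidal Lucas-Kanade optical flow.
# All numeric parameters below are assumptions for illustration.
import cv2
import numpy as np

def track_features(prev_gray, curr_gray):
    # Detect corner-like feature points p in the previous frame.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1000,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return np.empty((0, 2)), np.empty((0, 2))
    # Track the detected points into the current frame.
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                 pts, None,
                                                 winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    # Return the matched point pairs, i.e. the motion p -> p' between frames.
    return pts[ok].reshape(-1, 2), nxt[ok].reshape(-1, 2)
```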

[0098] In step 2, this embodiment divides each video frame t (t ranging from 1 to 387) into 8*8 = 64 grids, that is, M is 64 and D is 8; with 8 grids in one row, there are 9 grid corners in a row...
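As an illustration of this gridding, the short sketch below computes the (D+1)*(D+1) = 9*9 grid corner positions for a frame divided into D*D = 8*8 grids; the frame size, the uniform corner spacing, and the function name are assumptions for illustration, not details from the patent.

```python
# Illustrative sketch: corner coordinates of an 8*8 grid mesh over a frame.
# Frame size and uniform spacing are assumed, not taken from the patent.
import numpy as np

def grid_corners(width, height, D=8):
    xs = np.linspace(0, width, D + 1)   # 9 corner x-coordinates along a row
    ys = np.linspace(0, height, D + 1)  # 9 corner y-coordinates along a column
    # (D+1) x (D+1) array of (x, y) corner positions.
    return np.stack(np.meshgrid(xs, ys), axis=-1)

corners = grid_corners(1280, 720)       # shape (9, 9, 2): 81 grid corners
```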


PUM

No PUM

Abstract

A video stabilization method based on local and overall motion difference compensation, comprising the following steps: 1) use the optical flow method to obtain the motion trajectories of feature points in adjacent video frames, divide the video frames into grids, and, according to content-preserving constraints and similarity-invariant constraints, calculate the camera path of each grid and of the overall video frame; 2) calculate the compensation matrix between the overall camera path and each grid camera path, and calculate an optimized overall camera path according to the smoothness and overlap constraints of the path; 3) calculate the optimized grid camera paths from the optimized overall camera path and the compensation matrices; 4) calculate the deformation matrix of each grid from the grid camera paths before and after optimization, and deform the mesh to obtain a stable video frame. Compared with existing methods, this method uses the compensation matrices to reduce the number of paths to be optimized from the number of grids to a single overall path, which reduces computation time and improves efficiency.
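To make the role of the compensation matrices concrete, the sketch below shows one possible reading of steps 2) and 3), under the assumption that camera paths are represented as 3x3 homography matrices; a naive moving-average smoother stands in for the patent's smoothness- and overlap-constrained optimization, and all function and variable names are illustrative, not the patent's.

```python
# Sketch of the compensation-matrix idea: only the overall camera path is
# optimized, and each grid path is recovered by re-applying its compensation
# matrix. The moving-average smoother is a stand-in for the real optimization.
import numpy as np

def moving_average(path, radius=2):
    # Naive elementwise smoothing of a sequence of 3x3 path matrices.
    out = []
    for t in range(len(path)):
        window = path[max(0, t - radius): t + radius + 1]
        out.append(sum(window) / len(window))
    return out

def optimize_grid_paths(overall_path, grid_paths, smooth=moving_average):
    # overall_path: list of 3x3 matrices C_t (overall camera path per frame)
    # grid_paths[t][i]: 3x3 camera path of grid i at frame t
    optimized_overall = smooth(overall_path)
    optimized_grids = []
    for t, C_t in enumerate(overall_path):
        # Compensation matrix B_i = G_i * C_t^{-1}: how the path of grid i
        # differs from the overall path at frame t.
        comp = [G @ np.linalg.inv(C_t) for G in grid_paths[t]]
        # Re-apply each compensation to the optimized overall path, so that
        # only one path is optimized instead of one per grid.
        optimized_grids.append([B @ optimized_overall[t] for B in comp])
    return optimized_overall, optimized_grids
```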

Description

technical field

[0001] The invention relates to a video stabilization method, in particular to a video stabilization method based on local and overall motion difference compensation, and belongs to the technical fields of video signal processing and video motion compensation.

Background technique

[0002] With the popularity of digital cameras, people can shoot videos anytime and anywhere, which has led to a rapid and steady increase in the number of videos. However, different shooting environments and techniques produce quite different video effects. It is especially difficult for amateur videographers to hold a camera steadily throughout a shot, so their videos show obvious jitter and sometimes look blurry. At present, some hardware devices, such as trolleys, cams, and tripods, are used to eliminate camera shake during shooting, but these devices need to be prepared in advance and are usually unavailable when a video is shot spontaneously. Therefor...

Claims


Application Information

Patent Timeline: no application information
Patent Type & Authority: Patents (China)
IPC(8): H04N5/232
CPC: H04N23/683, H04N23/68
Inventor: 黄华, 黄建峰, 张磊
Owner: BEIJING INSTITUTE OF TECHNOLOGY