Joint optimization method for spatio-temporal consistency and feature-center EMD adaptive video stabilization

A joint optimization and video stabilization technology, applied to color television components, television system components, televisions, etc., which suppresses jitter components while preserving motion trends.

Active Publication Date: 2018-11-06
BEIHANG UNIV
Cites 3 · Cited by 10


Problems solved by technology

[0005] The technical problem solved by the present invention is to overcome the stability and robustness shortcomings of existing video stabilization methods by providing a joint optimization method of temporal-spatial consistency and feature-center EMD adaptive video stabilization. The method adaptively handles video stabilization, image saliency protection, parallax reduction, adaptive smoothing, crop area reduction, and video completion, improving the stability, generality, accuracy, and adaptability of video enhancement processing and improving video integrity.



Examples


Embodiment Construction

[0055] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0056] As shown in Figure 1, the steps of the present invention are:

[0057] (1) Extract image features with the SIFT method, perform feature matching to obtain saliency vectors, and use these as the benchmark for spatially consistent deformation.
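Step (1)'s feature matching can be sketched as a nearest-neighbour search over SIFT-style descriptors with Lowe's ratio test. This is an illustrative assumption, not the patent's implementation: `match_descriptors` is a hypothetical helper, and the toy 8-dimensional descriptors stand in for real 128-dimensional SIFT descriptors.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Match feature descriptors with Lowe's ratio test.

    desc_a, desc_b: (N, D) and (M, D) arrays of SIFT-style descriptors.
    Returns a list of (index_in_a, index_in_b) pairs.
    """
    matches = []
    for i, d in enumerate(desc_a):
        # Euclidean distance from descriptor d to every descriptor in desc_b
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Keep the match only if the best neighbour is clearly better
        # than the runner-up (Lowe's ratio test rejects ambiguous matches).
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

In practice a k-d tree or FLANN index would replace the brute-force distance loop; the ratio test itself is the standard way to discard ambiguous SIFT correspondences before estimating any deformation.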

[0058] (2) Starting from the viewpoint position, deform each image frame obtained in step (1), reacquire the feature set, construct a SIFT-based spatial structure matrix, and extract rotation, translation, and scaling information to build the original motion signal; then obtain the new motion signal with the adaptive intrinsic mode function optimization algorithm.
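The rotation/translation/scaling extraction in step (2) can be illustrated as a least-squares 2-D similarity fit over matched feature positions between consecutive frames. `similarity_from_matches` is a hypothetical helper sketching that idea, not the patent's spatial structure matrix construction:

```python
import numpy as np

def similarity_from_matches(src, dst):
    """Estimate scale s, rotation angle theta, and translation t such that
    dst ~= s * R(theta) @ src + t, by least squares over matched points.

    src, dst: (N, 2) arrays of corresponding feature positions.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    a, b = src - mu_s, dst - mu_d
    # In complex form, a 2-D similarity (without translation) is
    # multiplication by m = s * exp(i*theta); solve for m in least squares.
    za = a[:, 0] + 1j * a[:, 1]
    zb = b[:, 0] + 1j * b[:, 1]
    m = (zb @ za.conj()) / (za @ za.conj())
    s, theta = np.abs(m), np.angle(m)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = mu_d - s * R @ mu_s
    return s, theta, t
```

Accumulating (s, theta, t) frame-to-frame gives the per-parameter time series that the text calls the "original motion signal"; those 1-D signals are what the adaptive intrinsic mode function optimization would then smooth.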

[0059] (3) Use the adaptive motion signal computed in step (2) as the new input signal. According to the feature-center algorithm, preserve the motion trend of the original signal as much as possible while the jitter ...
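The trend-preserving jitter suppression in step (3) can be sketched with a minimal stand-in: here a simple moving average plays the role of discarding the high-frequency intrinsic mode functions of a true EMD decomposition, since the patent's adaptive EMD / feature-center smoothing is not reproduced here.

```python
import numpy as np

def smooth_motion(signal, window=21):
    """Split a 1-D motion signal into a low-frequency trend (intended
    camera motion) and a high-frequency residual (jitter).

    A moving average stands in for discarding the high-frequency IMFs
    of an EMD decomposition; `window` controls the smoothing strength.
    """
    pad = window // 2
    # Edge-replicate padding keeps the output the same length as the input.
    padded = np.pad(signal, pad, mode='edge')
    kernel = np.ones(window) / window
    trend = np.convolve(padded, kernel, mode='valid')
    jitter = signal - trend
    return trend, jitter
```

Warping each frame by the difference between the original and smoothed motion parameters then removes the jitter while following the trend, which is the goal the text describes.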



Abstract

The invention provides a joint optimization method for spatio-temporal consistency and feature-center EMD adaptive video stabilization. On the basis of decomposing noise signals with the EMD method, and with the suppression of video jitter as the goal, the method applies techniques based on spatial-structure-consistency matrix estimation, such as saliency protection, parallax elimination, adaptive smoothing, crop area reduction, and video completion, to jittery videos, thereby improving the stability, generality, accuracy, and adaptability of video enhancement and improving video integrity.

Description

Technical field

[0001] The invention relates to a joint optimization method of temporal-spatial consistency and feature-center EMD (Empirical Mode Decomposition) adaptive video stabilization, belonging to the technical field of computer vision enhancement.

Background technique

[0002] Handheld devices such as mobile phones, camcorders, tablets, and general-purpose cameras have become popular with amateurs, but because these devices have only rudimentary stabilization, the video they capture is often shaky and uncomfortable to watch. Video stabilization technology is designed to remove the inter-frame jitter and vibration visible in shaky videos. It is one of the most active research topics in the field of computer vision and can be applied to many high-level video enhancement applications, such as human observation, video recognition, video detection, video tracking, video compression, ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/00, G06T3/40, G06K9/46, H04N5/232
CPC: G06T3/4007, G06T5/005, G06V10/462, H04N23/682
Inventors: 郝爱民, 李晓, 李帅, 秦洪
Owner: BEIHANG UNIV