
A method for automatically generating motion data based on video motion estimation

A motion estimation and automatic generation technology applied in the field of dynamic data. It solves the problem of the high experience requirements placed on staff, ensures accurate matching, and makes the calculation process convenient and fast.

Active Publication Date: 2020-04-03
BEIJING XBURN TECH CO LTD
4 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0006] In view of this, the present invention provides a method for automatically generating dynamic data based on video motion estimation, which solves the prior-art problem of high requirements on staff experience.

Method used



Examples


Embodiment 2

[0084] Figure 2 is a flow chart of the method for automatically generating dynamic data based on video motion estimation in Embodiment 1. As shown in Figure 2, the method in this embodiment includes:

[0085] S21: Load the video image and assign each frame in the video image a code according to its playback order.

[0086] Specifically, the loaded video image can be any video image: a complete movie video or part of one, black-and-white or color, silent or with sound. The video image is composed of frames; for example, there are 24 frame pictures per second. Every frame picture has a picture motion vector, and all frame pictures have the same shape.
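The loading and coding step above can be sketched as follows. This is a minimal illustration, assuming frames are already decoded into equally-shaped NumPy arrays; the function name `code_frames` and the dictionary representation are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def code_frames(frames):
    """Assign each frame a code (0, 1, 2, ...) according to playback order.

    `frames` is an iterable of equally-shaped image arrays, e.g. one
    second of video decoded at 24 frames per second.
    """
    coded = {i: f for i, f in enumerate(frames)}
    # The method assumes all frame pictures share the same shape.
    shapes = {f.shape for f in coded.values()}
    assert len(shapes) <= 1, "frames must have identical shapes"
    return coded

# Example: one second of a tiny 24 fps grayscale video.
video = [np.zeros((4, 4), dtype=np.uint8) for _ in range(24)]
coded = code_frames(video)
```

In practice the frames would come from a video decoder rather than a synthetic list, but the coding-by-playback-order step is the same.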

[0087] S22: Acquire the first frame picture and the second frame picture.

[0088] Specifically, any frame in the video image is acquired as the first frame, and any frame in the v...
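The sprite-matching step described in the abstract (take an original sprite from the first frame, locate the best-matching contrast sprite in the second frame, and subtract the coordinate values to obtain the sprite's motion vector) can be sketched as below. The patent does not specify a matching criterion, so this sketch assumes exhaustive block matching with a sum-of-absolute-differences score; the function name and search window are illustrative.

```python
import numpy as np

def sprite_motion_vector(first, second, top, left, size, search=4):
    """Match the `size`x`size` original sprite of `first` at (top, left)
    against candidate contrast sprites in `second` within +/- `search`
    pixels, using sum of absolute differences (SAD).

    Returns (dy, dx): contrast-sprite coordinates minus original-sprite
    coordinates, i.e. the sprite's motion vector.
    """
    sprite = first[top:top + size, left:left + size].astype(np.int32)
    h, w = second.shape[:2]
    best_sad, best = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > h or x + size > w:
                continue  # candidate would fall outside the frame
            cand = second[y:y + size, x:x + size].astype(np.int32)
            sad = np.abs(cand - sprite).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

# Example: a 4x4 bright sprite moves from (2, 2) to (4, 5).
first = np.zeros((16, 16), dtype=np.uint8)
first[2:6, 2:6] = 255
second = np.zeros((16, 16), dtype=np.uint8)
second[4:8, 5:9] = 255
mv = sprite_motion_vector(first, second, top=2, left=2, size=4)
# mv == (2, 3): the sprite moved down 2 pixels and right 3 pixels.
```

Real implementations typically restrict the search window or use hierarchical search for speed; the exhaustive loop here is only meant to make the coordinate-subtraction idea concrete.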



Abstract

The invention discloses a method for automatically generating dynamic data based on video motion estimation. The method comprises the following steps: loading a video image and assigning each frame in the video image a code according to playback order; acquiring a first frame and a second frame; acquiring an original sprite from the first frame and a contrast sprite from the second frame, establishing a plane coordinate system on the frame, and subtracting the coordinate values of the original sprite from those of the contrast sprite to obtain the motion vector of the original sprite; acquiring the set of motion vectors and clustering it to obtain at least one cluster; taking the cluster containing the most motion vectors as the target cluster and averaging its motion vectors to obtain the frame motion vector of the first frame; and acquiring the frame motion vectors of all frames to calculate the dynamic data of the video image. The method provided by the invention reduces the experience requirements placed on workers.
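The clustering-and-averaging stage of the pipeline above can be sketched as follows. The abstract does not name a clustering algorithm, so this sketch uses a simple rounding-to-grid grouping as a stand-in; the function name, the tolerance parameter, and the grid scheme are all assumptions for illustration.

```python
from collections import defaultdict

import numpy as np

def frame_motion_vector(vectors, tol=1.0):
    """Cluster sprite motion vectors, pick the cluster with the most
    members (the target cluster), and return its mean as the frame
    motion vector.

    Clustering here just groups vectors whose components round to the
    same `tol`-sized grid cell -- a stand-in for whatever clustering
    the method actually uses.
    """
    clusters = defaultdict(list)
    for v in vectors:
        key = (round(v[0] / tol), round(v[1] / tol))
        clusters[key].append(v)
    # Target cluster: the one containing the most motion vectors.
    target = max(clusters.values(), key=len)
    # Frame motion vector: the average over the target cluster.
    return np.mean(target, axis=0)

# Example: three consistent sprite motions plus one outlier.
vecs = [(2.0, 3.0), (2.1, 2.9), (1.9, 3.1), (8.0, -5.0)]
fmv = frame_motion_vector(vecs)
# fmv is approximately (2.0, 3.0); the outlier lands in its own
# cluster and is ignored.
```

Taking the majority cluster rather than a global average is what makes the estimate robust to sprites that move differently from the dominant scene motion, which is presumably why the method averages only the target cluster.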

Description

technical field

[0001] The present invention relates to the field of dynamic data, and more particularly to a method for automatically generating dynamic data based on video motion estimation.

Background technique

[0002] In modern society, watching movies is one of people's main forms of entertainment. With the development of 3D movie technology, dynamic seats that perform corresponding actions as the movie plot changes have gradually become known.

[0003] A dynamic seat is a seat that can perform specific actions. Preset dynamic data are used to control the dynamic seat so that it performs corresponding actions as the movie plot changes; the dynamic data control the motion parameters of the dynamic seat. By controlling the dynamic seat to vibrate, shake, or move in step with the movie video, it gives the audience a motion experience in addition to the audio-visual experience, making the audience i...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04N5/14, H04N19/51, H04N19/513, H04N19/597
CPC: H04N5/145, H04N19/51, H04N19/513, H04N19/597
Inventor: 毕先春, 崔天龙
Owner: BEIJING XBURN TECH CO LTD