Video fusion method, device, equipment and storage medium

A video fusion technology in the field of data processing, addressing problems such as low accuracy, large deviation in the fusion of video with three-dimensional spatial information, and the inability to determine the coordinates of moving targets, with the effect of improving efficiency.

Pending Publication Date: 2021-03-12
丰图科技(深圳)有限公司

AI Technical Summary

Problems solved by technology

[0002] At present, positioning-algorithm research is usually based on a two-dimensional or three-dimensional spatial information platform without video fusion. In practical applications, however, it is necessary not only to determine the coordinates of a moving target but also to know the specific space in which the target is located for positioning and tracking. On a 2D or 3D spatial information platform without video fusion, the coordinates of moving targets cannot be determined, which results in large deviations when fusing video with 3D spatial information and very low accuracy.

Embodiment Construction

[0080] The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present application. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.

[0081] Referring to Figure 1, Figure 1 is a schematic flowchart of a video fusion method provided by an embodiment of the present application. The video fusion method may be executed by the video fusion apparatus provided by the embodiments of the present application, or by a device that integrates the video fusion apparatus, such as a terminal or a server; the device may be one equipped with a camera and an IMU (inertial measurement unit), such as smartphones, tablets, P...
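Since the executing device carries a camera and an IMU, the motion track and motion posture referred to throughout the application can, in principle, be estimated by integrating IMU samples between video frames. The following is a minimal dead-reckoning sketch, assuming gyroscope angular rates (rad/s) and accelerometer specific-force readings (m/s^2) at a fixed sample interval; the simple Euler integration, the gravity convention, and all variable names here are illustrative assumptions, not the patent's specified method.

```python
import numpy as np

def integrate_imu(gyro, accel, dt, gravity=np.array([0.0, 0.0, -9.81])):
    """Dead-reckon the motion posture (orientation) and motion track (position)
    from IMU samples. gyro, accel: arrays of shape (N, 3); dt: sample interval (s)."""
    R = np.eye(3)          # device orientation in the world frame
    v = np.zeros(3)        # velocity in the world frame
    p = np.zeros(3)        # position in the world frame
    track, posture = [], []
    for w, a in zip(gyro, accel):
        # Small-angle update of the orientation from the angular rate.
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
        # Rotate the specific force into the world frame and add gravity back.
        a_world = R @ a + gravity
        v = v + a_world * dt
        p = p + v * dt
        track.append(p.copy())
        posture.append(R.copy())
    return np.array(track), posture
```

In practice such raw integration drifts quickly, which is why the projection-based coordinate adjustment described in the abstract is needed to anchor the video to the three-dimensional scene.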

Abstract

The embodiments of the invention disclose a video fusion method, device, equipment and computer-readable storage medium. According to the embodiments, the method comprises the steps of: obtaining video data and obtaining shooting parameters corresponding to the video data; acquiring a motion track and a motion posture corresponding to the video data according to the shooting parameters; obtaining a three-dimensional space scene and loading the video data into the three-dimensional space scene according to the motion track and the motion posture to obtain initial coordinates corresponding to the video data; calculating projection coordinates of the video data in the three-dimensional space scene according to the shooting parameters and the video data; adjusting the initial coordinates according to the projection coordinates to obtain adjusted coordinates; and fusing the video data with the three-dimensional space scene according to the adjusted coordinates. By fusing the motion video with the three-dimensional scene, the accuracy and efficiency of obtaining specific positions in the video data are improved.
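The step sequence above maps naturally onto a pinhole-camera projection pipeline. The sketch below is a minimal illustration, assuming hypothetical intrinsics (from the shooting parameters) and a pose derived from the motion track and motion posture; the function names, the depth input and the blending weight are illustrative assumptions rather than the patent's actual formulation.

```python
import numpy as np

def project_pixel_to_scene(pixel_uv, depth, K, R, t):
    """Back-project a video pixel (u, v) at a known depth into the
    three-dimensional scene, using intrinsics K and camera pose (R, t)."""
    u, v = pixel_uv
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0]) * depth  # camera frame
    return R @ ray_cam + t                                      # scene frame

def adjust_coordinates(initial_xyz, projected_xyz, weight=0.5):
    """Blend the initial placement (from loading the video along the motion
    track/posture) with the projection-based estimate; 'weight' is a
    hypothetical tuning parameter."""
    return (1.0 - weight) * np.asarray(initial_xyz) + weight * np.asarray(projected_xyz)

# Illustrative usage with made-up numbers:
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])              # intrinsics from shooting parameters
R, t = np.eye(3), np.array([2.0, 0.0, 1.5])  # pose from motion track / motion posture
projected = project_pixel_to_scene((640, 360), depth=10.0, K=K, R=R, t=t)
adjusted = adjust_coordinates([11.0, 0.2, 1.4], projected)
```

The adjusted coordinates would then be used to place the video frames in the three-dimensional space scene, which is the fusion step the abstract describes.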

Description

Technical field

[0001] The present application relates to the technical field of data processing, and in particular to a video fusion method, device, equipment and storage medium.

Background technique

[0002] At present, positioning-algorithm research is usually based on a two-dimensional or three-dimensional spatial information platform without video fusion. In practical applications, however, it is necessary not only to determine the coordinates of a moving target but also to know the specific space in which the target is located for positioning and tracking. On a 2D or 3D spatial information platform without video fusion, the coordinates of moving targets cannot be determined, which results in large deviations when fusing video with 3D spatial information and very low accuracy.

Contents of the invention

[0003] Embodiments of the present application provide a video fusion method, device, equipment and storage medium, which can determine the coordinates of a moving tar...

Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/73G06T7/80G06T17/05
CPCG06T17/05G06T2207/10016G06T2207/20221G06T7/73G06T7/80
Inventor 罗炜孙玉权
Owner 丰图科技(深圳)有限公司