
Fusion method and system of mobile video and geographic scene and electronic equipment

A mobile-video fusion technology in the field of virtual reality, addressing the problem that drone video cannot be precisely fused with geographic scenes, and achieving high precision and practicality.

Pending Publication Date: 2020-08-25
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

[0005] In view of the deficiencies of the prior art described above, the purpose of the present invention is to provide a method, system, and electronic device for fusing mobile video with geographic scenes, so as to overcome the prior art's inability to precisely fuse UAV video with geographic scenes.




Embodiment Construction

[0045] To make the object, technical solution, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described here only explain the present invention and do not limit it.

[0046] Video geospatial fusion refers to matching and fusing one or more camera image-sequence videos (such as video captured by a drone's camera) with the related geographic scene to generate a new dynamic virtual scene or model, thereby integrating geographic scenes with real-time video. Video geospatial fusion is a branch of virtual reality technology and a stage in its development. Because the specific position of an image frame within the video is ambiguous and the perspective is incomplete when drone video is fused with the actual g...
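The abstract's final fusion step spreads the four corner coordinates of a key frame across the whole frame by interpolation. A minimal sketch of that idea, assuming bilinear interpolation over normalized frame coordinates and hypothetical corner values (the patent does not publish its exact interpolation scheme):

```python
import numpy as np

def bilinear_lonlat(corners_lonlat, u, v):
    """Interpolate (lon, lat) at a normalized frame position (u, v) in [0, 1]^2
    from the four key-frame corner coordinates, ordered TL, TR, BR, BL."""
    tl, tr, br, bl = [np.asarray(c, dtype=float) for c in corners_lonlat]
    top = (1 - u) * tl + u * tr        # interpolate along the top edge
    bottom = (1 - u) * bl + u * br     # interpolate along the bottom edge
    return (1 - v) * top + v * bottom  # blend between the two edges

# Hypothetical corner coordinates of one key frame (lon, lat)
corners = [(113.90, 22.61), (113.92, 22.61), (113.92, 22.59), (113.90, 22.59)]
center = bilinear_lonlat(corners, 0.5, 0.5)  # geographic position of the frame center
```

For a perspective (rather than affine) footprint, a full homography-based resampling would be needed; bilinear interpolation is only a first-order approximation of the per-pixel mapping.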



Abstract

The invention provides a method, system, and electronic device for fusing mobile video with a geographic scene. The fusion method comprises the following steps: selecting a video key frame from terrain image data determined according to the position information of an unmanned aerial vehicle (UAV) camera; extracting matching points between the video key frame and the terrain image; calculating the pixel coordinates of the video key frame's corner points from the matching-point coordinates and a perspective transformation formula, and converting those pixel coordinates into longitude and latitude coordinates; and obtaining a three-dimensional scene model of the mobile video fused onto the terrain image by interpolation, according to the longitude and latitude coordinates corresponding to the corner points of the video key frame. The method takes the UAV video and the geographic scene image as raw data, obtains high-precision matching points between the key frame and the geographic image with the ASIFT algorithm, and uses these matching points to fuse the dynamic scene model of the corresponding video area derived from the UAV video, achieving high precision and practicality.
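The corner-geolocation step described above can be sketched in two stages: apply a perspective (homography) transform to the key-frame corner pixels to land them in terrain-image pixel space, then map terrain pixels to longitude/latitude with an affine geotransform. The homography `H`, frame size, and geotransform values below are hypothetical stand-ins; in the patent's pipeline `H` would be estimated from the ASIFT matching points:

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography to Nx2 pixel points (perspective transform)."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))]) @ H.T
    return homog[:, :2] / homog[:, 2:3]  # divide out the projective scale

def pixel_to_lonlat(px, geotransform):
    """Map terrain-image pixels to (lon, lat) via a GDAL-style affine geotransform."""
    x0, dx, rx, y0, ry, dy = geotransform
    px = np.asarray(px, dtype=float)
    lon = x0 + px[:, 0] * dx + px[:, 1] * rx
    lat = y0 + px[:, 0] * ry + px[:, 1] * dy
    return np.column_stack([lon, lat])

# Hypothetical homography (pure translation here, for readability)
H = np.array([[1.0, 0.0, 100.0],
              [0.0, 1.0,  50.0],
              [0.0, 0.0,   1.0]])
corners = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]  # 1080p key-frame corners
terrain_px = warp_points(H, corners)

# Hypothetical geotransform: origin (113.9 E, 22.6 N), 1e-5 degrees per pixel
gt = (113.9, 1e-5, 0.0, 22.6, 0.0, -1e-5)
lonlat = pixel_to_lonlat(terrain_px, gt)
```

In practice the homography would be estimated robustly (e.g. RANSAC over the ASIFT correspondences) rather than assumed, and a real `H` warps the four corners into a general quadrilateral, not a translated rectangle.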

Description

Technical field

[0001] The invention relates to the technical field of virtual reality, and in particular to a method, system, and electronic device for fusing mobile video with geographic scenes.

Background technique

[0002] With the continuous development of video technology, applications in different fields are increasing, such as emergency and disaster response. When natural disasters such as earthquakes, landslides, and mudslides occur, drones are widely used in terrain surveying and mapping, oblique photography, scene detection, and other fields thanks to their low cost, strong mobility, and low safety risk. However, current conventional monitoring systems suffer from massive, dispersed, and isolated video, incomplete perspectives, and unclear locations; fusing mobile video with geographic scenes, for security-oriented video monitoring and emergency disaster reduction, addresses this situation. derived fro...
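The matching step named in the abstract is ASIFT, which achieves affine invariance by simulating a set of camera tilts and rotations before running SIFT on each simulated view. A sketch of that sampling schedule, following the standard ASIFT design (tilts in a geometric series t = sqrt(2)^k, rotation step 72°/t); the exact parameters used in the patent are not published, so treat these values as the textbook defaults rather than the authors' configuration:

```python
import math

def asift_simulation_params(max_tilt_power=5):
    """Enumerate the (tilt, rotation) camera poses that ASIFT simulates.

    Tilts follow t = sqrt(2)^k for k = 1..max_tilt_power; for each tilt,
    in-plane rotations phi step by 72/t degrees over [0, 180).
    """
    params = [(1.0, 0.0)]  # the original, unsimulated view
    for k in range(1, max_tilt_power + 1):
        t = math.sqrt(2) ** k
        step = 72.0 / t  # finer rotation sampling at stronger tilts
        phi = 0.0
        while phi < 180.0:
            params.append((t, phi))
            phi += step
    return params
```

Each (tilt, rotation) pair defines an affine warp of the key frame; SIFT features from all warped copies are pooled and matched against the terrain image, which is what makes the matching robust to the oblique viewpoints typical of drone video.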


Application Information

IPC(8): G06K9/00; G06K9/62; G06T17/05
CPC: G06T17/05; G06T2200/04; G06V20/13; G06V10/751
Inventor: 汤圣君, 赵维淞, 王伟玺, 李晓明, 郭仁忠
Owner: SHENZHEN UNIV