Target object positioning virtual-real fusion method and device

A virtual-real fusion technology for target object positioning, applied in image data processing, television, instruments, etc.; it solves the problem that the actual state of the target object cannot be effectively reflected.

Active Publication Date: 2020-05-26
武汉市政环境工程建设有限公司 and 2 others

AI Technical Summary

Problems solved by technology

[0004] To address the technical problem in the above-mentioned prior art that the current position of the target object can be seen in the BIM three-dimensional model but its actual state cannot be effectively reflected, the present invention provides a virtual-real fusion method for target object positioning. Through this method, the surveillance video containing the target object can be retrieved and displayed, realizing a two-way mapping of the target object between the physical entity corresponding to the surveillance video and the digital space corresponding to the BIM 3D model.




Embodiment Construction

[0030] The technical solution of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0031] The present invention provides a virtual-real fusion method for target object positioning, which is used to obtain the actual state of a target object on a construction site. As shown in Figure 1, the method includes the following steps. Information acquisition step: deploy several monitoring cameras on the construction site, or use several monitoring cameras already present there; perform BIM modeling of the construction site to obtain a BIM three-dimensional model of it; obtain the position and view direction of each camera in the world coordinate system of the BIM 3D model; and obtain the camera coordinate system of each camera and the coordinates of the target object in the world coordinate system. According to the position of ...
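The coordinate conversion described above, taking the target's world coordinates into each camera's coordinate system via the camera's world matrix and its inverse, can be sketched as follows. This is a minimal illustration, not the patent's exact formulation: the look-at construction, the z-up world convention, and all function names are assumptions introduced here.

```python
import numpy as np

def camera_world_matrix(position, view_dir, up=(0.0, 0.0, 1.0)):
    """Build a camera's world matrix (camera -> world) from its position
    and viewing direction via a look-at construction. A z-up world is
    assumed, as is common for BIM models (an assumption, not from the patent)."""
    f = np.asarray(view_dir, dtype=float)
    f /= np.linalg.norm(f)                        # forward axis
    r = np.cross(f, np.asarray(up, dtype=float))
    r /= np.linalg.norm(r)                        # right axis
    u = np.cross(r, f)                            # true up axis
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2] = r, u, f        # basis vectors as columns
    m[:3, 3] = position                           # camera position in the world
    return m

def world_to_camera(target_world, cam_matrix):
    """Transform a world-space point into camera coordinates by applying
    the inverse of the camera's world matrix."""
    p = np.append(np.asarray(target_world, dtype=float), 1.0)
    return (np.linalg.inv(cam_matrix) @ p)[:3]
```

For example, a camera at the origin looking along +x sees a target at world point (5, 0, 0) at camera coordinates (0, 0, 5), i.e. five units straight ahead along the camera's forward axis.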



Abstract

The invention provides a virtual-real fusion method and device for target object positioning. The method comprises the following steps: acquiring the coordinates of a target object in the world coordinate system, and the position and viewing direction of each camera in the world coordinate system; introducing a world matrix and its inverse matrix; obtaining the coordinates of the target object in each camera's coordinate system through coordinate conversion; judging whether those camera-space coordinates fall within the camera's three-dimensional field of view in the BIM three-dimensional model; and obtaining a display of the target object's monitored state by calling up the monitoring video of a camera in whose field of view it lies. The invention realizes a bidirectional mapping of the target object between the physical entity (the target object) corresponding to the monitoring video and the digital space corresponding to the BIM three-dimensional model, and realizes retrieval and playback of the monitoring video containing the target object, so that the actual working state of the target object is obtained.
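The visibility judgment in the abstract, deciding whether the target's camera-space coordinates lie within a camera's three-dimensional field of view, can be sketched as a frustum test. The field-of-view angles and near/far distances below are illustrative assumptions, not values from the patent.

```python
import math

def in_view(cam_coords, h_fov_deg=60.0, v_fov_deg=40.0, near=0.5, far=100.0):
    """Return True if a point given in camera coordinates (x right, y up,
    z forward) lies inside the camera's viewing frustum. FOV angles and
    clip distances are illustrative defaults, not from the patent."""
    x, y, z = cam_coords
    if not (near <= z <= far):           # must be in front, within range
        return False
    # half-angle tests against the horizontal and vertical field of view
    if abs(math.atan2(x, z)) > math.radians(h_fov_deg / 2):
        return False
    if abs(math.atan2(y, z)) > math.radians(v_fov_deg / 2):
        return False
    return True
```

In use, each camera's frustum test would be evaluated against the target's converted coordinates, and the monitoring video of any camera for which the test passes would be called up and played.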

Description

Technical Field

[0001] The invention relates to the technical field of digital management of construction engineering, and in particular to a virtual-real fusion method and device for target object positioning.

Background Technique

[0002] Building Information Modeling (BIM; also BIM model or BIM 3D model) is a new tool in architecture, engineering and civil engineering. 3D visualization of objects based on the BIM 3D model is widely used in current construction site management: through positioning technologies such as GIS, Beidou GNSS, Bluetooth and WiFi ranging, or through intelligent video analysis, the coordinates of a target object in the BIM 3D model are obtained and displayed in a 3D visualization.

[0003] The main disadvantage of the existing technology is that it can only realize a one-way mapping from the real physical space to the three-dimensional digital virtual space represented by BIM, that is, the position of the curre...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T19/00, G06T17/05, G06T7/70, G06F16/78, H04N13/243, H04N13/275, H04N13/296, H04N7/18
CPC: G06T19/006, G06T17/05, G06T7/70, G06F16/78, H04N13/243, H04N13/275, H04N13/296, H04N7/181, G06T2207/10016, G06T2207/30196, G06T2207/30232
Inventors: 何勇, 谢瑜, 钟义明, 姚继锋
Owner: 武汉市政环境工程建设有限公司