Video analysis-based space positioning method

A spatial positioning and video analysis technology, applied in photogrammetry/videogrammetry, photo interpretation, the use of re-radiation, etc., which avoids the complexity of position description found in simple longitude and latitude positioning

Active Publication Date: 2014-09-10
SHENZHEN INST OF ADVANCED TECH

AI Technical Summary

Problems solved by technology

[0004] The present invention makes full use of the dense monitoring facilities deployed in urban scenes, quickly locates the accurate position of a target, solves the problem of precise positioning in complex urban scenes, avoids the complexity of position description inherent in simple longitude and latitude positioning, and can obtain the motion track of a monitored target over a period of time.

Method used

Image

  • Video analysis-based space positioning method

Examples

Embodiment Construction

[0006] In order to make the purpose, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with specific embodiments and accompanying drawings. It should be understood that the specific embodiments described herein are only used to explain the technical solution of the present invention, and should not be construed as limiting the present invention.

[0007] The present invention provides a spatial positioning method based on video analysis, which is used to obtain the movement trajectory of a monitored target within a certain range. The method includes the following steps:

[0008] Step S100, acquiring and storing the coordinate information, parameter information and related attribute information of the camera;

[0009] Step S200, acquiring and storing video data including the target;

[0010] Step S300, searching for video data containing the target;

[0011] Step S400, acquiring a...
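
Steps S100 to S300 describe acquisition and retrieval: registering each camera's coordinate, parameter and attribute information, storing video data, and finding the video that contains the target. The following is a minimal Python sketch of that storage layer; it is not part of the patent, all class and function names are hypothetical, and the per-clip feature descriptors are only an assumed mechanism for the search in step S300. The later steps are sketched after the abstract below.

```python
# Hypothetical sketch of steps S100-S300: camera registry, video storage,
# and search of stored clips by a target's characteristic information.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CameraInfo:
    """Step S100: coordinate, parameter and attribute information of one camera."""
    camera_id: str
    x: float                  # camera position in assumed planar map units
    y: float
    heading_deg: float        # example "parameter information": viewing direction
    attributes: Dict[str, str] = field(default_factory=dict)


@dataclass
class VideoClip:
    """Step S200: stored video data, indexed by camera and time span."""
    camera_id: str
    start_ts: float
    end_ts: float
    # Assumed per-clip feature descriptors (e.g. appearance tags extracted offline).
    features: List[str] = field(default_factory=list)


class MonitoringStore:
    def __init__(self) -> None:
        self.cameras: Dict[str, CameraInfo] = {}
        self.clips: List[VideoClip] = []

    def register_camera(self, cam: CameraInfo) -> None:        # S100
        self.cameras[cam.camera_id] = cam

    def store_clip(self, clip: VideoClip) -> None:             # S200
        self.clips.append(clip)

    def search_clips(self, target_feature: str) -> List[VideoClip]:  # S300
        """Return stored clips whose descriptors contain the target's feature."""
        return [c for c in self.clips if target_feature in c.features]


if __name__ == "__main__":
    store = MonitoringStore()
    store.register_camera(CameraInfo("cam-01", x=100.0, y=250.0, heading_deg=90.0))
    store.store_clip(VideoClip("cam-01", 0.0, 60.0, features=["red-car"]))
    print(store.search_clips("red-car"))
```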

Abstract

The invention relates to a video analysis-based space positioning method. The method comprises the following steps: 1, obtaining and storing the coordinate information, parameter information and relevant attribute information of a plurality of cameras within a range; 2, obtaining and storing video data containing a target; 3, searching the video data containing the target according to the characteristic information of the target; 4, obtaining and storing the coordinate information of the target at every moment in real time according to the position of the target relative to the cameras and the position coordinate information, parameter information and relevant attribute information of those cameras; and 5, obtaining the motion track of the target from its coordinate information at every moment. The method makes full use of the dense monitoring facilities deployed in urban scenes, can rapidly locate the accurate position of the target, can obtain the motion track of the monitored target over a period of time, solves the precise positioning problem in complex urban scenes, and avoids the complexity of position description found in simple latitude and longitude positioning.
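
Steps 4 and 5 combine the target's position relative to an observing camera with that camera's stored coordinate and parameter information to obtain world coordinates at each moment, then order those coordinates in time to form the motion track. A minimal sketch of that combination follows; the planar map, the per-camera heading angle, and the (forward, right) offset format are assumptions made only for illustration, since the patent does not fix a coordinate model.

```python
# Hypothetical sketch of steps 4-5: camera-relative offsets -> world
# coordinates per moment -> time-ordered motion track.
import math
from typing import Dict, List, Tuple

# camera_id -> (x, y, heading_deg): stands in for the stored "coordinate
# information" and "parameter information" of each camera.
CAMERAS: Dict[str, Tuple[float, float, float]] = {
    "cam-01": (100.0, 250.0, 90.0),
    "cam-02": (180.0, 250.0, 180.0),
}

# (timestamp, camera_id, forward_m, right_m): target position relative to
# the observing camera, as produced by some video-analysis step.
Detection = Tuple[float, str, float, float]


def to_world(det: Detection) -> Tuple[float, float, float]:
    """Step 4: one camera-relative observation -> world coordinates at one moment."""
    ts, cam_id, forward, right = det
    cx, cy, heading_deg = CAMERAS[cam_id]
    h = math.radians(heading_deg)
    # Rotate the (forward, right) offset by the camera heading, then translate
    # by the camera's world position.
    wx = cx + forward * math.cos(h) + right * math.sin(h)
    wy = cy + forward * math.sin(h) - right * math.cos(h)
    return ts, wx, wy


def motion_track(detections: List[Detection]) -> List[Tuple[float, float, float]]:
    """Step 5: world coordinates sorted by time form the motion track."""
    return sorted((to_world(d) for d in detections), key=lambda p: p[0])


if __name__ == "__main__":
    dets = [(12.0, "cam-01", 5.0, -1.0), (3.0, "cam-01", 20.0, 2.0)]
    for ts, x, y in motion_track(dets):
        print(f"t={ts:5.1f}  x={x:7.2f}  y={y:7.2f}")
```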

Description

【Technical field】

[0001] The invention relates to space positioning technology, and in particular to a space positioning method based on video analysis.

【Background technique】

[0002] Spatial positioning methods in the prior art are typically based on individual cameras tracking and positioning targets within their own monitoring ranges. In complex urban scenes with wide-ranging, fast-moving targets, such methods do not make full use of the dense monitoring equipment of the urban scene; it is difficult to obtain the position of the target quickly and accurately, and difficult to obtain the target's motion track over a period of time.

【Content of invention】

[0003] The present invention aims to solve the problems existing in the above-mentioned prior art and proposes a spatial positioning method based on video analysis, which is used to obtain the movement track of a target within a certain range. The method includes ...
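
The description does not specify how the target's position relative to a camera is derived from the video itself. One common approach, assumed here purely for illustration, is to back-project the target's pixel location through a calibrated pinhole camera onto the ground plane, treating the intrinsics, mounting height, and orientation as the stored "parameter information". All numeric values, the function name, and the OpenCV-style axis convention below are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch: pixel location -> ground-plane world coordinates
# via pinhole back-projection (assumed method, not specified by the patent).
import numpy as np


def pixel_to_ground(u: float, v: float,
                    fx: float, fy: float, cx: float, cy: float,
                    cam_xyz: np.ndarray, yaw_deg: float,
                    tilt_down_deg: float) -> np.ndarray:
    """Back-project pixel (u, v) onto the ground plane z = 0.

    cam_xyz: camera position in world coordinates (z = mounting height > 0).
    yaw_deg: viewing direction about the vertical axis (0 = facing world +y).
    tilt_down_deg: downward tilt of the optical axis.
    """
    # Ray direction in the camera frame (x right, y down, z forward).
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

    # Base orientation: camera forward -> world +y, right -> +x, down -> -z.
    r0 = np.array([[1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [0.0, -1.0, 0.0]])
    t = np.radians(tilt_down_deg)
    rx = np.array([[1.0, 0.0, 0.0],               # tilt about the camera's x axis
                   [0.0, np.cos(-t), -np.sin(-t)],
                   [0.0, np.sin(-t), np.cos(-t)]])
    p = np.radians(yaw_deg)
    rz = np.array([[np.cos(p), -np.sin(p), 0.0],  # yaw about the world's z axis
                   [np.sin(p), np.cos(p), 0.0],
                   [0.0, 0.0, 1.0]])
    d_world = rz @ r0 @ rx @ d_cam

    # Intersect the viewing ray with the ground plane z = 0.
    if d_world[2] >= 0:
        raise ValueError("ray does not hit the ground plane")
    s = -cam_xyz[2] / d_world[2]
    return cam_xyz + s * d_world


if __name__ == "__main__":
    cam = np.array([100.0, 250.0, 6.0])   # example: camera mounted 6 m high
    print(pixel_to_ground(960.0, 700.0, fx=1000.0, fy=1000.0, cx=960.0, cy=540.0,
                          cam_xyz=cam, yaw_deg=0.0, tilt_down_deg=30.0))
```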

Claims

Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C11/36
CPC: G01C11/04; G01C11/36
Inventor: 修文群, 张宝运
Owner: SHENZHEN INST OF ADVANCED TECH