
Target space positioning method

A technology for target space positioning, applied in image analysis, image enhancement, instruments, etc. It addresses the problem of insufficient spatial positioning accuracy, achieving improved spatial positioning, fine target contours, and higher accuracy.

Inactive Publication Date: 2019-12-20
HUAZHONG UNIV OF SCI & TECH
Cites: 6 · Cited by: 12

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a target space positioning method to solve the technical problem that existing target space positioning methods, by independently using a single class of technique (such as a geometric method or a deep learning method), cannot overcome that technique's inherent drawbacks, resulting in insufficient spatial positioning accuracy.


Image

  • Target space positioning method

Examples


Embodiment 1

[0043] A target space positioning method 100, as shown in Figure 1, comprising:

[0044] Step 110: simultaneously capture two images of the target from different viewing angles, and obtain a three-dimensional coordinate set for each pixel in one of the images through binocular vision space positioning;
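Step 110 can be sketched as standard rectified-stereo triangulation (the patent does not specify the binocular algorithm; the function name, camera parameters, and the assumption of a precomputed disparity map below are illustrative, not from the patent):

```python
import numpy as np

def disparity_to_3d(disparity, f, baseline, cx, cy):
    """Back-project a rectified-stereo disparity map to per-pixel 3D coordinates.

    Standard triangulation: Z = f * B / d, X = (u - cx) * Z / f,
    Y = (v - cy) * Z / f. Pixels with non-positive disparity have no
    depth and are marked NaN.
    """
    h, w = disparity.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    with np.errstate(divide="ignore", invalid="ignore"):
        Z = np.where(disparity > 0, f * baseline / disparity, np.nan)
    X = (u - cx) * Z / f
    Y = (v - cy) * Z / f
    return np.stack([X, Y, Z], axis=-1)  # (H, W, 3) coordinate set

# Toy example: constant 8 px disparity, focal length 800 px, 0.1 m baseline.
disp = np.full((4, 4), 8.0)
coords = disparity_to_3d(disp, f=800.0, baseline=0.1, cx=2.0, cy=2.0)
```

With these toy parameters every pixel lands at depth Z = 800 × 0.1 / 8 = 10 m; in practice the disparity map would come from stereo matching on the two captured views.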

[0045] Step 120: based on instance segmentation, perform classification and regression on the targets in the image to obtain a set of target binary masks;
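Step 120's output format can be illustrated as follows. A typical instance segmentation network emits one soft (per-pixel probability) mask per detected target; thresholding turns these into the binary mask set the patent refers to. The names `soft_masks` and `MASK_THRESHOLD`, and the random stand-in data, are assumptions for illustration:

```python
import numpy as np

# Hypothetical per-instance foreground-probability maps, as an
# instance-segmentation network might produce them.
MASK_THRESHOLD = 0.5

def to_binary_mask_set(soft_masks, threshold=MASK_THRESHOLD):
    """Threshold each instance's soft mask into a boolean binary mask."""
    return [(m >= threshold) for m in soft_masks]

rng = np.random.default_rng(0)
soft_masks = [rng.random((4, 4)) for _ in range(2)]  # two detected targets
mask_set = to_binary_mask_set(soft_masks)
```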

[0046] Step 130: based on the three-dimensional coordinate set of each pixel and the target binary mask set, obtain the three-dimensional coordinates of the targets in the image through pixel coordinate mapping and fusion, thereby realizing spatial positioning of the targets.
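The mapping-and-fusion of Step 130 can be sketched as selecting the 3D points of the pixels inside a target's mask and aggregating them into one position. The median aggregation below is an illustrative robustness choice, not the patent's specified fusion rule:

```python
import numpy as np

def locate_target(coords, mask):
    """Fuse per-pixel 3D coordinates with one target's binary mask.

    Collects the 3D points of the masked pixels, discards pixels
    without a valid depth (NaN), and takes the per-axis median as
    the target's 3D position. Returns None if no valid point exists.
    """
    pts = coords[mask]                     # (N, 3) points inside the mask
    pts = pts[~np.isnan(pts).any(axis=1)]  # drop pixels lacking depth
    if len(pts) == 0:
        return None
    return np.median(pts, axis=0)

# Toy data: depth 5 m everywhere, mask covering the top-left quadrant.
coords = np.zeros((4, 4, 3))
coords[..., 2] = 5.0
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
position = locate_target(coords, mask)
```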

[0047] Based on binocular vision space positioning, sparse three-dimensional coordinates (actual distance values relative to the origin) are obtained, describing the target's real scale and spatial positioning information...

Embodiment 2

[0076] A storage medium in which instructions are stored; when a computer reads the instructions, the computer is caused to execute the target space positioning method described in Embodiment 1 above.

[0077] The relevant technical solutions are the same as those in Embodiment 1, and will not be repeated here.



Abstract

The invention discloses a target space positioning method comprising the following steps: simultaneously acquiring two target images with different viewing angles, and obtaining a three-dimensional coordinate set of each pixel in one image through binocular vision space positioning; based on instance segmentation, performing classification and regression on targets in the image to obtain a set of target binary masks; and, based on the three-dimensional coordinate set of each pixel and the target binary mask set, obtaining the three-dimensional coordinates of the targets in the image through pixel coordinate mapping and fusion, thereby realizing target space positioning. Binocular vision space positioning yields sparse three-dimensional coordinates that describe the target's real scale and spatial positioning information. A deep learning method performs monocular instance segmentation on targets of interest of specific categories, accurately defining the semantic attributes of pixels. Finally, by coupling the three-dimensional coordinates with the instance segmentation result, target space positioning is performed under the connection relationship of the pixel coordinates, so that the sparsely dispersed three-dimensional coordinates become dense and the positioning accuracy is enhanced.
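The densification idea in the abstract, making sparse stereo coordinates dense within a target's contour, can be sketched as follows. The fill-with-median strategy is an assumption for illustration; the patent does not disclose its exact densification rule:

```python
import numpy as np

def densify_in_mask(coords, mask):
    """Propagate sparse 3D coordinates to all pixels inside a target mask.

    Illustrative densification: pixels inside the mask that lack a 3D
    coordinate (NaN depth) inherit the per-axis median coordinate of the
    mask's valid pixels, so the sparse points become dense within the
    target's contour.
    """
    out = coords.copy()
    valid = mask & ~np.isnan(coords[..., 2])
    if not valid.any():
        return out                         # nothing to propagate
    fill = np.median(coords[valid], axis=0)
    holes = mask & np.isnan(coords[..., 2])
    out[holes] = fill
    return out

# Toy data: one sparse measurement inside a mask covering the whole image.
coords = np.full((3, 3, 3), np.nan)
coords[0, 0] = [0.0, 0.0, 2.0]
mask = np.ones((3, 3), dtype=bool)
dense = densify_in_mask(coords, mask)
```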

Description

Technical field

[0001] The invention belongs to the field of target space positioning, and more particularly relates to a target space positioning method.

Background technique

[0002] With the continuous development of production and daily life, the location information of targets has drawn attention from more and more fields. Target space positioning has wide applications in many scenarios, such as danger-zone alarms in factory areas, obstacle prediction for autonomous driving, and position and attitude estimation in aerospace.

[0003] Existing target space positioning methods mainly include hardware-assisted methods, traditional geometric methods, and depth estimation methods based on deep learning. The hardware-assisted method mainly refers to locating the target through analysis and calculation of transmitted/received active signals with the help of radio frequency identification, parti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/10; G06T7/70; G01C11/08
CPC: G01C11/08; G06T2207/20221; G06T7/10; G06T7/70
Inventors: 韩守东, 夏晨斐, 陈国荣, 刘巾英
Owner HUAZHONG UNIV OF SCI & TECH