
Data fusion and target detection method, device and equipment

A data fusion and target detection technology, applied in the field of environment perception, that addresses the problem of low accuracy of detection results.

Status: Pending
Publication Date: 2020-10-30
SHANGHAI GOLDWAY INTELLIGENT TRANSPORTATION SYST CO LTD

AI Technical Summary

Problems solved by technology

Target detection is based on only one kind of data at a time (point cloud or image), so the accuracy of the detection results is low.




Embodiment Construction

[0109] The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings of those embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0110] In order to achieve the above purpose, embodiments of the present invention provide a data fusion and target detection method, device and equipment. The method and device can be applied to various electronic devices, which are not limited here. The data fusion method is described in detail first.

[0111] Figure 1 is a first schematic flow chart of the data fusion method provided by an embodiment of the present invention, which includes...



Abstract

Embodiments of the invention provide a data fusion and target detection method, device and equipment. The method comprises: obtaining an image and a point cloud, and determining, for each point in the point cloud, the pixel to which that point projects in the image as a projection image pixel point; inputting the pixel values of the projection image pixel points into a dimension prediction model to obtain image features of a target dimension, the target dimension being the dimension adapted to the point cloud; and performing dimension splicing of the image features and the point cloud. In a first aspect, this provides a scheme for fusing the image with the point cloud; in a second aspect, because the image features output by the dimension prediction model match the dimension of the point cloud, fusing them with the point cloud improves the fusion effect.
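The abstract only describes the pipeline at a high level. Below is a minimal sketch of that flow, not the patented implementation: it assumes a known 4x4 lidar-to-camera extrinsic and 3x3 camera intrinsic, and uses a placeholder callable `dim_model` to stand in for the dimension prediction model, whose architecture the excerpt does not specify. In practice `dim_model` would be a trained network mapping each pixel value to a feature vector whose dimension matches the point cloud's, so that the final concatenation ("dimension splicing") is well formed.

```python
import numpy as np

def project_points(points_lidar, extrinsic, intrinsic):
    """Project 3D lidar points (N, 3) into the image plane.

    extrinsic: 4x4 lidar-to-camera transform; intrinsic: 3x3 camera matrix.
    Returns integer pixel coordinates (N, 2) and a mask of points in front of the camera.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])  # homogeneous (N, 4)
    pts_cam = (extrinsic @ pts_h.T).T[:, :3]                            # camera frame (N, 3)
    in_front = pts_cam[:, 2] > 0
    uvw = (intrinsic @ pts_cam.T).T
    uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)                         # pixel coords (N, 2)
    return uv, in_front

def fuse(points_lidar, image, extrinsic, intrinsic, dim_model):
    """Gather the pixel value at each point's projection, map it to the point
    cloud's feature dimension with `dim_model`, and concatenate per point."""
    uv, valid = project_points(points_lidar, extrinsic, intrinsic)
    h, w = image.shape[:2]
    valid &= (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    pts = points_lidar[valid]
    pixel_values = image[uv[valid, 1], uv[valid, 0]]   # pixel values of projection image pixel points
    feats = dim_model(pixel_values)                    # image features of the target dimension (M, D)
    return np.hstack([pts, feats])                     # dimension splicing: (M, 3 + D)
```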

Description

technical field

[0001] The present invention relates to the technical field of environment perception, and in particular to a data fusion and target detection method, device and equipment.

Background technique

[0002] In some scenarios, for example the automatic driving scene of a vehicle or the autonomous movement scene of a robot, it is usually necessary to perceive the external environment in real time. Perceiving the external environment can be understood as detecting the three-dimensional coordinates of each target in the external environment. Generally, a point cloud collected by a lidar and an image collected by a camera can be obtained, and the three-dimensional coordinates of the targets can be obtained by combining these two kinds of data.

[0003] In related solutions, combining the above two kinds of data usually means performing target detection based on the point cloud and on the image respectively, to obtain two target detection results...
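For contrast with the fusion scheme in the abstract, the related solution in [0003] runs a detector on each modality independently. The following is only a structural sketch of that related-art setup; the two detectors are hypothetical placeholders, and the step that combines their results is truncated in the excerpt above, so it is not shown.

```python
import numpy as np

# Placeholder single-modality detectors (assumptions, not from the patent):
# they stand in for "a lidar-only detector" and "a camera-only detector".
def detect_on_point_cloud(points: np.ndarray):
    # Would return e.g. (x, y, z, score) detections from the point cloud alone.
    return []

def detect_on_image(image: np.ndarray):
    # Would return e.g. (u, v, w, h, score) boxes from the image alone.
    return []

def related_art_detection(points: np.ndarray, image: np.ndarray):
    # Each modality is processed in isolation, so neither detector can draw on
    # the other's information while detecting; only the finished result sets
    # are available for later combination (that combination step is truncated
    # in the source text).
    return detect_on_point_cloud(points), detect_on_image(image)
```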


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04
CPC: G06V2201/07; G06N3/045; G06F18/214; G06F18/253
Inventors: 张泽瀚, 张明, 赵显, 邝宏武
Owner: SHANGHAI GOLDWAY INTELLIGENT TRANSPORTATION SYST CO LTD