System, method and device for labeling continuous frame data

A technology for labeling continuous frame data, applied in the field of continuous frame data labeling systems. It addresses problems such as the labor-intensive nature of manual annotation, the complexity of the data form, and the negative impact of labeling errors on neural network training, and achieves the effects of improving labeling speed and accuracy, reducing the labeling workload, and reducing the cost of labeling.

Active Publication Date: 2021-07-16
MOMENTA SUZHOU TECH CO LTD
Cites: 6 | Cited by: 1
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0003] At present, most annotation data is manually annotated, including 2D images, 3D lidar point cloud data, etc., which is a very slow and inefficient process.
It requires people to sit in front of a computer screen, operate annotation tools and mark objects one by one, which is extremely labor-intensive.
For lidar data, due to the complexity and sparseness of its data form, it is easy to make labeling errors or miss labels, which may even have a negative impact on neural network training.

Method used

Figure 1 is a schematic structural diagram of the continuous frame data labeling system; Figure 2 is a schematic flowchart of the labeling method applied to the cloud; Figure 3 is a schematic flowchart of the labeling method applied to the labeling end.


Examples


Embodiment 1

[0068] Referring to Figure 1, which is a schematic structural diagram of a labeling system for continuous frame data provided by an embodiment of the present invention. The system can be applied to automatic driving, and through it a large amount of labeled data can be generated faster and more efficiently for model training. As shown in Figure 1, the continuous frame data labeling system provided in this embodiment specifically includes: a cloud 110 and a labeling terminal 120; wherein,

[0069] The cloud 110 is configured to: obtain a labeling task, where the labeling task includes the category, location and output file format of the object to be labeled;

[0070] Here, the labeling task serves as prior information for the labeling process, and includes the object to be labeled (such as vehicles, pedestrians, etc.), the category of the object to be labeled (such as a tricycle, bus or car), the preset size, and the output file format of the labeling file, etc. The l...
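As an illustrative sketch only, not part of the disclosed embodiment, the labeling task used as prior information could be represented roughly as follows in Python; all field names and example values here are assumptions:

# Hypothetical representation of a labeling task used as prior information by the cloud 110.
# Field names and example values are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class LabelingTask:
    objects: List[str] = field(default_factory=lambda: ["vehicle", "pedestrian"])
    categories: List[str] = field(default_factory=lambda: ["tricycle", "bus", "car"])
    # preset size (length, width, height) in meters for each category of labeling box
    preset_size: Dict[str, Tuple[float, float, float]] = field(
        default_factory=lambda: {"car": (4.5, 1.8, 1.5), "bus": (12.0, 2.5, 3.2)}
    )
    output_format: str = "json"      # format of the output labeling file

task = LabelingTask()

Such a structure would be delivered to the cloud together with the continuous frame data, so that the detector knows which objects, categories and size priors to use and which file format to emit.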

Embodiment 2

[0086] Referring to Figure 2, which is a schematic flowchart of a method for labeling continuous frame data applied to the cloud, provided by an embodiment of the present invention. The method of this embodiment can be performed by a device for labeling continuous frame data, which can be implemented in software and/or hardware and can generally be integrated into cloud servers such as Alibaba Cloud and Baidu Cloud; the embodiments of the present invention do not limit this. As shown in Figure 2, the method provided in this embodiment specifically includes:

[0087] 210. Obtain the labeling task.

[0088] Here, the labeling task includes the category and position of the object to be labeled.

[0089] 220. Read the continuous frame data, perform target detection on each frame of data in the continuous frame data according to the labeling task, and use the obtained category and position of the object to be labeled in each frame of data as the ...
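To make the cloud-side flow concrete, the following is a minimal sketch of steps 210-220 together with the cross-frame association and file generation summarized in the Abstract. It assumes a generic per-frame detector and a simple IoU-based association between consecutive frames; the helper names (detect_objects, pre_label) and the association rule are illustrative assumptions, not the patent's own algorithm:

# Minimal sketch of the cloud-side pre-labeling flow: detect objects per frame according
# to the labeling task, then associate the same object across time-ordered frames to
# form the pre-labeling result. The IoU-based matching is an assumed, simplified tracker.
import json

def iou(a, b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def pre_label(frames, task, detect_objects, iou_threshold=0.5):
    """frames: time-ordered frame data; detect_objects(frame, task) -> [(category, box), ...]."""
    next_id = 0
    tracks = []                               # objects seen in the previous frame
    result = {"task": task, "frames": []}     # extensible: further keys can be added later
    for t, frame in enumerate(frames):
        labeled = []
        for category, box in detect_objects(frame, task):
            match = None
            for tr in tracks:                 # associate with the same object in the previous frame
                if tr["category"] == category and iou(tr["box"], box) > iou_threshold:
                    match = tr
                    break
            if match is None:
                next_id += 1
                obj_id = next_id              # a newly appearing object gets a new identity
            else:
                obj_id = match["id"]
            labeled.append({"id": obj_id, "category": category, "box": list(box)})
        tracks = labeled
        result["frames"].append({"t": t, "objects": labeled})
    return result

# The extensible pre-labeling file sent to the labeling end could then be written as, e.g.:
# open("pre_label.json", "w").write(json.dumps(pre_label(frames, task, detector)))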

Embodiment 3

[0094] Referring to Figure 3, which is a schematic flowchart of a method for labeling continuous frame data applied to the labeling end, provided by an embodiment of the present invention. The method can be executed by a device for labeling continuous frame data, which can be implemented in software and/or hardware and can generally be integrated into a labeling terminal. As shown in Figure 3, the method provided in this embodiment specifically includes:

[0095] 310. Obtain the pre-labeling result of the continuous frame data sent by the cloud.

[0096] 320. If a correction instruction for the pre-labeling result is received, correct the pre-labeling result according to the correction instruction, and use the corrected labeling result as the target labeling result of the continuous frame data.

[0097] Here, the pre-labeling result is: after the cloud reads the continuous frame data, according to the labeling task, the target detection result of the object to b...
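As a rough sketch of the labeling-end behavior in steps 310-320, assuming corrections arrive as simple per-object edit records (a structure invented here purely for illustration):

# Minimal sketch: apply the annotator's correction instructions to the cloud's pre-labeling
# result; if no correction is received, the pre-labeling result is kept as the target
# labeling result. The correction record format below is an assumption.
import copy

def apply_corrections(pre_labeling, corrections):
    """corrections: list of {"frame": t, "id": obj_id, "box": [...], "category": "..."} records."""
    result = copy.deepcopy(pre_labeling)
    for corr in corrections or []:
        frame = result["frames"][corr["frame"]]
        for obj in frame["objects"]:
            if obj["id"] == corr["id"]:
                # overwrite only the fields the annotator actually corrected
                obj.update({k: v for k, v in corr.items() if k in ("box", "category")})
    return result    # the corrected result is used as the target labeling result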



Abstract

The embodiment of the invention discloses a system, method and device for labeling continuous frame data. The system comprises a cloud end and a labeling end. The cloud end reads continuous frame data and performs target detection on each frame of data in the continuous frame data according to a labeling task, obtaining a detection result of the to-be-labeled object in each frame of data; according to the detection result and the time sequence information of each frame of data, an association relationship between the same object to be labeled across the frames of data is established as a pre-labeling result; an extensible pre-labeling file is generated according to the pre-labeling result, and the pre-labeling file and the continuous frame data are sent to the labeling end. The labeling end receives the continuous frame data sent by the cloud end and the corresponding pre-labeling file, and, after a correction instruction for the pre-labeling file is received, corrects the pre-labeling file according to the correction instruction to obtain a target labeling result. By adopting this scheme, the manual time required for labeling continuous frame data is shortened, the labeling efficiency of continuous frame data is improved, and the labeling cost is reduced.

Description

technical field
[0001] The invention relates to the technical field of automatic driving, and in particular to a system, method and device for labeling continuous frame data.
Background technique
[0002] In the field of autonomous driving, the perception module takes the data of various sensors and high-precision map information as input and, after a series of calculations and processing, can accurately perceive the surrounding environment of the autonomous driving vehicle. At present, mainstream automatic driving perception algorithms adopt deep learning methods, which require a large number of labeled data sets to train the model, so the ability to generate a large amount of labeled data faster and more efficiently is key to automatic driving perception.
[0003] Currently, most of the labeled data are manually labeled, including 2D images, 3D lidar point cloud data, etc., which is a very slow and inefficient process. It requires people to sit in front of a ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/538, G06F16/55, G06F16/587, G06F16/58
CPC: G06F16/5866, G06F16/55, G06F16/587, G06F16/538, G06V10/774, G06V20/56, G06V20/70, G06V10/945
Inventors: 马贤忠, 胡皓瑜, 江浩, 董维山, 范一磊
Owner: MOMENTA SUZHOU TECH CO LTD