
Data annotation method and device and data annotation model training method and device

A technology for data labeling models and their training methods, applied in the field of data processing. It addresses problems such as low labeling efficiency and the inability of manual labeling to produce refined labels, with the effect of bringing target frames closer to the actual object edges in the image and improving labeling accuracy and speed.

Pending Publication Date: 2021-10-15
CITY CLOUD TECH HANGZHOU CO LTD

AI Technical Summary

Problems solved by technology

[0006] This embodiment addresses the problems that manual labeling cannot achieve refined labeling and that labeling efficiency is low. By performing edge detection within the manually marked initial target frame, the target frame is brought closer to the actual object edge in the image; and by automatically labeling images with a target detection algorithm, labeling accuracy and speed are improved.
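
The publication does not disclose which edge detector is used to refine the manual box. The following minimal Python sketch only illustrates the general idea of paragraph [0006], assuming OpenCV's Canny detector and an (x, y, w, h) box format chosen here for illustration.

```python
# Minimal sketch of the refinement idea in paragraph [0006]: tighten a manually
# drawn initial target frame to the nearest detected edges inside it.
# The edge detector (Canny) and box format are assumptions, not the patented method.
import cv2
import numpy as np

def refine_box_with_edges(image, box, low=50, high=150):
    """image: BGR image array; box: (x, y, w, h) manual initial target frame."""
    x, y, w, h = box
    roi = image[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)      # binary edge map of the region of interest
    ys, xs = np.nonzero(edges)              # pixels lying on detected edges
    if len(xs) == 0:                        # no edges found: keep the manual box
        return box
    # Shrink the box to the bounding rectangle of the detected edge pixels.
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    return (x + int(x0), y + int(y0), int(x1 - x0 + 1), int(y1 - y0 + 1))
```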


Examples


Embodiment 1

[0037] This embodiment of the application provides a data labeling method. Referring to Figure 1, the method includes the following steps S101 to S104:

[0038] Step S101, acquiring a set of images to be labeled, and determining the labeling category of the target to be labeled in each of the images to be labeled.

[0039] Step S102, judging whether a corresponding data labeling model exists for the labeling category of the target to be labeled in each image to be labeled.

[0040] Step S103, if yes, performing target labeling on the image to be labeled using the data labeling model to obtain a first labeled data set.

[0041] Step S104, if not, obtaining the initial target frame of the target to be labeled in the image to be labeled, detecting the target edge position in the initial target frame of the at least one target to be labeled, and adjusting the initial target frame of the at least one target to be labeled according to the target edge position to obtain a second labeled data set.
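
The sketch below walks through steps S101 to S104 as one loop. All interfaces here (the models dictionary, the .detect() method, and the refine_fn callable) are hypothetical; the publication does not specify them.

```python
# Illustrative sketch of steps S101-S104: auto-label when a model exists for the
# category (S103), otherwise refine a manually drawn initial box via edges (S104).
def annotate(images, categories, models, manual_boxes, refine_fn):
    """images[i] is an image, categories[i] its labeling category,
    manual_boxes[i] a manually drawn initial target frame (used only when no
    model exists for the category), and refine_fn an edge-based box refiner
    such as refine_box_with_edges above."""
    first_set, second_set = [], []                # S103 / S104 outputs
    for image, category, box in zip(images, categories, manual_boxes):
        model = models.get(category)              # S102: is there a model for this category?
        if model is not None:
            first_set.append((image, model.detect(image)))      # S103: automatic labeling
        else:
            second_set.append((image, refine_fn(image, box)))   # S104: edge-refined manual box
    return first_set, second_set
```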

Embodiment 2

[0074] Based on the same idea, and referring to Figure 9, the application also proposes a data labeling device, including:

[0075] an acquisition module 901, configured to acquire a set of images to be labeled and determine the labeling category of the target to be labeled in each of the images to be labeled;

[0076] a judging module 902, configured to judge whether a corresponding data labeling model exists for the labeling category of the target to be labeled in each image to be labeled;

[0077] a first labeling module 903, configured to perform target labeling on the image to be labeled through the data labeling model to obtain a first labeled data set;

[0078] a second labeling module 904, configured to obtain the initial target frame of the target to be labeled in the image to be labeled, detect the target edge position in the initial target frame of the at least one target to be labeled, and adjust the initial target frame of the at least one target to be labeled according to the target edge position to obtain a second labeled data set.
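
One hypothetical way to read the module split of Embodiment 2 is as a class whose methods correspond to modules 901 to 904. The publication describes the device only functionally; the structure and interfaces below are illustrative assumptions.

```python
# Hypothetical class-based sketch of modules 901-904; not the claimed implementation.
class DataLabelingDevice:
    def __init__(self, models, refine_fn):
        self.models = models          # labeling category -> model (assumed interface)
        self.refine_fn = refine_fn    # edge-based refiner, e.g. refine_box_with_edges

    def acquire(self, images, categories):            # acquisition module 901
        return list(zip(images, categories))

    def has_model(self, category):                    # judging module 902
        return category in self.models

    def label_with_model(self, image, category):      # first labeling module 903
        return self.models[category].detect(image)

    def label_with_edges(self, image, initial_box):   # second labeling module 904
        return self.refine_fn(image, initial_box)
```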

Embodiment 3

[0080] This embodiment also provides an electronic device. Referring to Figure 10, it includes a memory 1004 and a processor 1002; the memory 1004 stores a computer program, and the processor 1002 is configured to run the computer program to perform the steps in any one of the above method embodiments.

[0081] Specifically, the above-mentioned processor 1002 may include a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.

[0082] The memory 1004 may include mass storage for data or instructions. By way of example and not limitation, the memory 1004 may include a hard disk drive (HDD), a floppy disk drive, a solid-state drive (SSD), flash memory, an optical disk, a magneto-optical disk, magnetic tape, or a Universal Serial Bus (USB) drive...



Abstract

The invention provides a data annotation method and device and a data annotation model training method and device. The data annotation method comprises the following steps: acquiring a set of images to be annotated, and determining the annotation category of the target to be annotated in each image; judging whether a corresponding data annotation model exists for the annotation category of the target to be annotated in each image; if yes, performing target annotation on the image through the data annotation model to obtain a first annotated data set; and if not, acquiring an initial target frame of the target to be annotated in the image, detecting a target edge position in the initial target frame of the at least one target to be annotated, and adjusting the initial target frame of the at least one target to be annotated according to the target edge position to obtain a second annotated data set. According to the invention, images that cannot be annotated automatically are annotated by combining image processing techniques with manual annotation, and annotation quality and efficiency are improved by combining automation with manual annotation.

Description

Technical field

[0001] The present application relates to the field of data processing, and in particular to a data labeling method and device and a data labeling model training method and device.

Background technique

[0002] In recent years, urban governance has become a hot topic. Through the extensive deployment of monitoring and the in-depth application of information technologies such as artificial intelligence and cloud computing, the urban governance model has completed the transformation from traditional monitoring that merely "sees" to monitoring that "understands". This technology drives the development of intelligent urban construction and is a necessary path for urban governance.

[0003] For a machine to "understand", it first needs to recognize the relevant targets in video images. In the existing technology, the relevant targets in an image are manually labeled, and the machine then continues to learn from them, so as to improve recognition of the corresponding targets. ...


Application Information

IPC(8): G06F16/583; G06T7/13; G06T5/00; G06T5/30; G06K9/62; G06N3/08
CPC: G06F16/583; G06T7/13; G06T5/30; G06N3/084; G06N3/088; G06T2207/20081; G06F18/214; G06F18/24; G06T5/70
Inventor: 徐剑炯, 方玲洪, 毛云青, 董墨江
Owner: CITY CLOUD TECH HANGZHOU CO LTD