
Article identification method for efficiently labeling samples

A recognition method and sample labeling technology applied in the field of item recognition with efficient sample labeling. It addresses problems such as the low efficiency of image processing and recognition, the large proportion of background area in the target region, and slow model training, thereby saving sample labeling time and improving detection speed.

Active Publication Date: 2020-08-11
青岛联合创智科技有限公司

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to overcome the shortcomings of the existing item identification process and, in view of the defects of current item identification methods, such as the large number of samples required, slow model training, the large proportion of background area within the target region, and the low efficiency of image processing and recognition, to provide an item identification method with efficient sample labeling.



Examples


Embodiment 1

[0052] This embodiment relates to an item identification method with efficient sample labeling; the specific process steps are as follows:

[0053] S1. Item subject detection and category prediction

[0054] An item detection model is trained using the item detection algorithm; the trained model then locates the region of the video image that is of interest to the user and predicts the corresponding category. The item detection algorithm is an improved Mask R-CNN; the specific process is as follows:

[0055] S11. Training sample library

[0056] As required, prepare static pictures of the corresponding categories as training samples to form a training data set. The training data set covers 16 item classes, namely: knife, cup, remote control, shoulder bag, mobile phone, scissors, laptop, mouse, backpack, keys, wallet, glasses, umbrella, fan, puppy, and kitten. The training data set mainly consists of three parts, namely...
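The paragraph above is cut off before it describes the three parts of the data set, but steps S11/S12 (building the sample library and merging the per-image annotation files) can be sketched roughly as below. This is not the patent's implementation: the COCO-style JSON layout, the directory structure, and the merge_annotations helper are assumptions made for illustration; only the 16 class names come from the text.

import json
from pathlib import Path

# The 16 item classes listed in paragraph [0056]; the detector adds a background class later.
CLASSES = [
    "knife", "cup", "remote control", "shoulder bag", "mobile phone", "scissors",
    "laptop", "mouse", "backpack", "keys", "wallet", "glasses",
    "umbrella", "fan", "puppy", "kitten",
]

def merge_annotations(label_dir, out_file):
    # Combine per-image annotation files (assumed COCO-style JSON) into one training file (step S12).
    merged = {
        "images": [],
        "annotations": [],
        "categories": [{"id": i + 1, "name": name} for i, name in enumerate(CLASSES)],
    }
    next_ann_id = 0
    for image_id, path in enumerate(sorted(Path(label_dir).glob("*.json"))):
        sample = json.loads(path.read_text())
        sample["image"]["id"] = image_id
        merged["images"].append(sample["image"])
        for ann in sample["annotations"]:
            ann["id"], ann["image_id"] = next_ann_id, image_id
            merged["annotations"].append(ann)
            next_ann_id += 1
    Path(out_file).write_text(json.dumps(merged))

# Example: merge_annotations("labels/", "train_annotations.json")

Merging everything into a single annotation file matches the abstract's description of combining all sample annotation files into the final training sample data before model training.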

Embodiment 2

[0083] The item identification method with efficient sample labeling described in Embodiment 1 can be used for item search. During item search, images of targets along the route are acquired according to the planned path; the acquired video frames are processed with the item identification method to predict the target category, and the real distance between the detected target and the camera is obtained from the depth map. The detected target category is checked against the category of the item to be found, and once the match is verified, the specific location of the target is reported to the user by voice broadcast. There are two item-search situations: searching for a general category and searching for a sub-category. The following uses a robot as an application example to explain the item-search process.
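As a rough illustration of the verification step described above (not the patent's actual code), the sketch below checks each detection against the requested category, estimates its distance to the camera from the depth map, and returns the values a voice module could announce. The detection dictionary format, the 0.5 score threshold, the median-depth estimate, and the announce() call are all assumptions.

import numpy as np

def find_target(detections, depth_map, target_class, min_score=0.5):
    # detections: list of dicts like {"label": str, "score": float, "box": (x1, y1, x2, y2)}
    # depth_map:  H x W array of distances in metres (e.g. from the binocular camera)
    for det in detections:
        if det["label"] != target_class or det["score"] < min_score:
            continue
        x1, y1, x2, y2 = (int(v) for v in det["box"])
        region = depth_map[y1:y2, x1:x2]
        if region.size == 0:
            continue
        distance = float(np.nanmedian(region))  # robust target-to-camera distance estimate
        return distance, det["box"]
    return None

# result = find_target(detections, depth_map, "cup")
# if result is not None:
#     distance, box = result
#     announce(f"Found the cup about {distance:.1f} metres ahead.")  # announce() is hypothetical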

[0084] S21. Search for general items

[0085] General-category item search refers to searching for a certain type of item in the...

Embodiment 3

[0120] The main body of the robot in Embodiment 2 comprises a binocular camera, a controller, a voice interaction module, a drive unit, and a power supply. The binocular camera is mounted on the robot head and is used for video image acquisition; it is electrically connected to the controller inside the robot body, and the controller is electrically connected to the power supply. The voice interaction module is arranged on the surface of the robot body and is electrically connected to the controller; it is used for voice interaction between the user and the robot and for adding sub-class samples. The drive unit is arranged at the lower part of the robot, adopts an existing crawler or wheel drive structure, and is electrically connected to the controller.
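The patent does not specify how the binocular camera produces the depth map used for distance estimation in Embodiment 2, so the following is only a plausible sketch using OpenCV's semi-global block matching on a rectified stereo pair; the focal length, baseline, and matcher parameters are placeholder values that would normally come from stereo calibration.

import cv2
import numpy as np

FOCAL_PX = 700.0    # focal length in pixels (placeholder; from calibration)
BASELINE_M = 0.06   # distance between the two lenses in metres (placeholder)

# Semi-global block matching: one common choice, not necessarily the patent's method.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=96, blockSize=7)

def depth_from_stereo(left_gray, right_gray):
    # Return a depth map in metres from a rectified grayscale stereo pair.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # mark invalid matches
    return FOCAL_PX * BASELINE_M / disparity  # depth = f * B / d

A depth map of this kind is what the search step in Embodiment 2 samples inside the detected bounding box to report the real distance between the target and the camera.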

[0121] The controller involved in this embodiment...



Abstract

The invention belongs to the field of intelligent identification of articles and relates to an article identification method for efficiently labeling samples. The method comprises the following process steps: S11, preparing static pictures of the corresponding categories as training samples to form a training data set; S12, performing image annotation and combining all the sample annotation files to obtain the final training sample data for training an article detection model; S13, performing model training with a ResNet-101 backbone network, repeating the training operation on the basis of the existing model with modified training parameters until a model meeting the user's requirements is obtained; S14, performing target detection with the Mask R-CNN algorithm to obtain the predicted category, the contour information of the segmented target area, and the bounding box. The method reduces background interference, effectively improves the accuracy of target matching, reduces the amount of calculation, and increases the target matching speed; at the same time, the sample labeling mode greatly reduces sample labeling time, saving manpower and time.
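To make steps S13 and S14 concrete, here is a minimal sketch of assembling a Mask R-CNN detector with a ResNet-101 FPN backbone in PyTorch/torchvision and running one training step and one detection. The patent's improvements to Mask R-CNN are not detailed on this page, so this uses the stock torchvision model; the class count (16 items plus background), optimizer settings, and input format are assumptions.

import torch
from torchvision.models.detection import MaskRCNN
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone

# Step S13: ResNet-101 + FPN backbone (ImageNet-pretrained weights would normally be passed in).
backbone = resnet_fpn_backbone(backbone_name="resnet101", weights=None)
model = MaskRCNN(backbone, num_classes=17)  # 16 item classes + background

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9, weight_decay=1e-4)

def train_one_step(images, targets):
    # images: list of CHW float tensors; targets: list of dicts with "boxes", "labels", "masks".
    model.train()
    loss_dict = model(images, targets)   # Mask R-CNN returns a dict of losses in training mode
    loss = sum(loss_dict.values())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss)

@torch.no_grad()
def detect(image):
    # Step S14: returns boxes, labels, scores and per-instance masks for one image tensor.
    model.eval()
    return model([image])[0]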

Description

Technical field:

[0001] The invention belongs to the technical field of intelligent object identification and relates to an object identification method in which target detection samples are labeled rapidly and the background around the detected target outline is small, in particular to an item identification method with efficient sample labeling.

Background technique:

[0002] In daily life, people usually place all kinds of daily necessities at random. When a certain item is needed, it is often difficult to find it in time because of the messy placement of various items, which causes great inconvenience and trouble and wastes the user's energy and time. In addition, intelligent item identification involves a large amount of sample data, and training of the sample model is slow, which adds difficulty and lowers efficiency.

[0003] In the prior art, the Chinese patent with publication number CN109241854A discloses a method and device for finding items based on a robot. The method includes: determining the lost item inform...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06K9/32; G06K9/00; G06N3/04; G06T7/13; G06T7/60
CPC: G06T7/13; G06T7/60; G06T2207/10016; G06T2207/30204; G06V20/40; G06V20/10; G06V10/25; G06N3/045; G06F18/2431; G06F18/214; Y02P90/30
Inventors: 纪刚, 商胜楠
Owner: 青岛联合创智科技有限公司