
Contour recognition model training method and target object detection method

A contour recognition model training method, a target object detection method, a target object detection device, electronic equipment, and a computer-readable storage medium are disclosed, addressing the technical problem in the prior art that accurate identification of the target object and short detection time cannot both be achieved.

Pending Publication Date: 2020-12-29
SHANGHAI XIAOI ROBOT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] In view of this, the embodiments of the present application provide a training method for a contour recognition model, a method for detecting a target object, a device for detecting a target object, an electronic device, and a computer-readable storage medium, so as to solve the technical problem in the prior art that accurate recognition of the target object and low time consumption cannot both be achieved.




Embodiment Construction

[0025] The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.

[0026] As mentioned above, target detection is a classic problem in the field of computer vision; its purpose is to detect the position, contour, and type of the target object in an input image. The target detection method in the prior art uses convolution kernels representing different target categories to perform convolution calculations on the image to be detected, and obtains the position, contour, and type of the target object through classification. How...


Abstract

The invention provides a contour recognition model training method. The method comprises the steps of: providing basic data for a plurality of samples, wherein the basic data corresponds to the contour of a recognition object in each sample; acquiring a plurality of pieces of training data based on the basic data of each sample; and training a neural network model based on the samples carrying the plurality of pieces of training data, so that the neural network model can output, for each sample, a plurality of pieces of contour recognition data corresponding to the plurality of pieces of training data. Since the basic data of a sample corresponds to the contour of the recognition object in that sample, the plurality of pieces of training data correspond to a plurality of training contours obtained from the contour of the recognition object, and one sample therefore carries a plurality of pieces of training data. By training the neural network model on samples that each carry a plurality of pieces of training data, the training method gives the contour recognition model the ability to take one sample as input and output a plurality of pieces of contour recognition data corresponding to the plurality of pieces of training data.
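As a rough illustration of the abstract's idea, the Python sketch below derives several pieces of training data from one sample's contour and trains a small network to output one piece of contour recognition data per piece of training data. Everything here is an assumption made only for the example (polygon contours stored as (x, y) points, a random-perturbation scheme, a toy network, and MSE loss); it is not the patent's actual design.

    # Minimal sketch: one sample, multiple derived training contours, multiple outputs.
    import torch
    import torch.nn as nn

    K = 4   # assumed number of training contours derived from one sample
    P = 32  # assumed number of points per contour

    def make_training_contours(base_contour, k=K, noise=0.02):
        """Derive k pieces of training data from one sample's basic contour data."""
        return torch.stack([base_contour + noise * torch.randn_like(base_contour)
                            for _ in range(k)])                  # shape: (k, P, 2)

    class MultiContourNet(nn.Module):
        """Toy model: one input sample in, K contour predictions out."""
        def __init__(self):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4), nn.Flatten())
            self.head = nn.Linear(8 * 4 * 4, K * P * 2)

        def forward(self, x):
            return self.head(self.backbone(x)).view(-1, K, P, 2)

    model = MultiContourNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    image = torch.rand(1, 1, 64, 64)    # one training sample
    base_contour = torch.rand(P, 2)     # its basic data: the recognition object's contour
    targets = make_training_contours(base_contour).unsqueeze(0)  # (1, K, P, 2)

    # Train so the model outputs K pieces of contour recognition data for this one sample.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(image), targets)
    loss.backward()
    optimizer.step()

The point of the sketch is the data layout: one input sample is paired with several training targets, so the trained model maps one sample to several contour outputs, as the abstract describes.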

Description

Technical Field

[0001] The present application relates to the field of computer technology, and in particular to a training method for a contour recognition model, a method for detecting a target object, a device for detecting a target object, electronic equipment, and a computer-readable storage medium.

Background Technology

[0002] Target detection is a classic problem in the field of computer vision; its purpose is to detect the position, contour, and type of the target object in an input image. The target detection method in the prior art uses convolution kernels representing different target categories to perform convolution calculations on the image to be detected, and obtains the position, contour, and type of the target object through classification. However, for a target object or multiple overlapping target objects, the detection accuracy of this method is low: the contour of the target object cannot be accurately detected, and multiple overlapping target object...
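To make the prior-art approach described in the background concrete (this is not the method of the present application), the Python sketch below applies one convolution kernel per target category to an image and reads off a per-pixel class map; the kernel size, random weights, and thresholding are assumptions made only for the example.

    # Illustrative sketch of prior-art convolution-based detection (assumed details).
    import torch
    import torch.nn.functional as F

    num_categories = 3
    image = torch.rand(1, 1, 64, 64)               # image to be detected (1 channel)
    kernels = torch.rand(num_categories, 1, 5, 5)  # one convolution kernel per target category

    # Convolve the image with every category kernel to get per-category score maps.
    scores = F.conv2d(image, kernels, padding=2)   # shape: (1, num_categories, 64, 64)

    # Classification step: each pixel is assigned the category with the strongest response.
    class_map = scores.argmax(dim=1)               # shape: (1, 64, 64)

    # A crude foreground mask stands in for the detected position/contour of the target.
    foreground = scores.max(dim=1).values > scores.mean()

As the background notes, this kind of per-pixel classification struggles when target objects overlap, which is the limitation the claimed training method is directed at.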


Application Information

IPC(8): G06K9/46, G06K9/62, G06N3/04
CPC: G06V10/44, G06N3/045, G06F18/214
Inventor: 王晓珂
Owner: SHANGHAI XIAOI ROBOT TECH CO LTD