
Method and apparatus for obtaining training data

A technology for acquiring training data, applied in the field of video processing. It addresses the problems of pre-recorded videos occupying large storage space, annotators missing labels, and the heavy time cost of manual annotation, achieving efficient acquisition of training data.

Inactive Publication Date: 2017-03-22
BEIJING KUANGSHI TECH +1

AI Technical Summary

Problems solved by technology

The existing method mainly has the following problems: (1) in order to capture enough relevant target objects (such as vehicles, pedestrians, etc.), a large number of videos must be recorded, and the pre-recorded videos occupy a large amount of storage space; (2) in many scenes, most of the video content does not contain relevant target objects, and processing this empty data consumes a great deal of the annotators' time; (3) in natural scenes, affected by lighting and other factors, annotators may miss labels; (4) accurately labeling individual targets takes annotators considerable effort and time.

Method used



Embodiment Construction

[0026] In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments of the present invention will be described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention, and it should be understood that the present invention is not limited by the exemplary embodiments described here. All other embodiments obtained by those skilled in the art, based on the embodiments described herein and without creative effort, shall fall within the protection scope of the present invention.

[0027] First, an example electronic device 100 for implementing the method and apparatus for acquiring training data according to embodiments of the present invention will be described with reference to FIG. 1.

[0028] As shown in FIG. 1, th...



Abstract

The invention provides a method and apparatus for obtaining training data. The method comprises the following steps: receiving original video data; detecting a target object in the original video data using a trained first neural network, and automatically annotating the detected target object; and obtaining, based on the automatically annotated video data, the training data used for training a second neural network. With the method and apparatus provided by embodiments of the invention, the target object in the original video data is automatically annotated by the trained neural network in order to obtain training data for a target neural network. This saves the enormous time that annotators would otherwise spend manually labeling the original video data, effectively avoids target objects being missed during manual annotation, and efficiently yields high-quality training data.
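The two-stage pipeline described in the abstract (a pretrained "first" detector auto-labels raw video frames, and the resulting labels become training data for a second network) can be sketched as follows. This is a minimal illustration only: the `Annotation` type, the detector callable, and the toy frame representation are hypothetical stand-ins, not the patent's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List, Tuple

# A single auto-generated label: which frame, what class, where in the frame.
@dataclass
class Annotation:
    frame_index: int
    label: str
    box: Tuple[int, int, int, int]  # (x, y, width, height), hypothetical format

# The detector is any callable mapping a frame to a list of (label, box) pairs,
# e.g. a pretrained object-detection network.
Detector = Callable[[object], List[Tuple[str, Tuple[int, int, int, int]]]]

def auto_annotate(frames: Iterable[object], detector: Detector) -> List[Annotation]:
    """Run the pretrained first network over raw video frames and collect
    automatic annotations; frames with no detections contribute nothing,
    so 'empty' video content is skipped without annotator effort."""
    training_data: List[Annotation] = []
    for i, frame in enumerate(frames):
        for label, box in detector(frame):
            training_data.append(Annotation(i, label, box))
    return training_data

# Toy stand-in detector: pretends even-numbered frames contain a vehicle.
def toy_detector(frame):
    return [("vehicle", (0, 0, 10, 10))] if frame % 2 == 0 else []

data = auto_annotate(range(4), toy_detector)  # annotations for frames 0 and 2
```

The collected `data` would then be reviewed or used directly to train the second neural network; the automatic pass replaces most of the frame-by-frame manual labeling the background section describes.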

Description

Technical Field

[0001] The present invention relates to the technical field of video processing, and more particularly to a method and device for acquiring training data.

Background

[0002] Neural networks play a vital role in the field of computer vision (such as face recognition, object detection, autonomous driving, etc.) and have become the dominant technology in this field. Before a neural network can be put into use, a large number of labeled pictures must be used as training samples to train it. In most cases, the amount and quality of training data have a significant impact on the performance of a neural network. Therefore, acquiring large-scale, high-quality training data is very important.

[0003] For neural networks that use video streams as training data, the existing data acquisition method is to pre-record multiple videos and then send them to annotators for frame-by-frame annotation. This method mainly ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06N3/08
CPC: G06N3/08; G06V20/47
Inventor: 肖特特, 茅佳源
Owner BEIJING KUANGSHI TECH