
Feature extraction network training method and device, terminal equipment and medium

A feature extraction network training technology, applied in the field of image processing, which addresses the problems that sampled regions contain irrelevant information, fail to reflect the complete information of objects, and match poorly with downstream tasks, achieving the effect of improved matching with downstream tasks and good consistency.

Pending Publication Date: 2022-04-29
SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD

AI Technical Summary

Problems solved by technology

[0003] Self-supervised learning methods have been widely used in the field of object detection, but the regions cropped by existing self-supervised learning methods when sampling from the original image either cannot reflect the complete information of an object or contain a large amount of irrelevant information. Therefore, the currently learned features are incomplete or redundant, making the corresponding object detection poorly matched with downstream segmentation tasks.




Embodiment Construction

[0028] In the following description, specific details such as particular system structures and technologies are set forth for the purpose of illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to those skilled in the art that the present application may be practiced in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.

[0029] It should be understood that, when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or combinations thereof.

[0030] It should...



Abstract

The embodiments of the invention are applicable to the technical field of image processing and provide a feature extraction network training method and device, terminal equipment and a medium. The method comprises the steps of: determining a background interval of each pixel point in a video frame sequence; determining foreground pixel points in a target video frame according to the background interval, the target video frame being any video frame in the video frame sequence; acquiring a foreground image in each target video frame according to the foreground pixel points; determining a positive sample and a negative sample of the foreground image of the target video frame to construct a positive sample pair and a negative sample pair; and performing self-supervised training on a preset feature extraction network using the positive sample pair and the negative sample pair. The feature extraction network obtained by this training method matches well with the downstream tasks of object detection and segmentation.
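For illustration only, the following is a minimal sketch of the pipeline described in the abstract, assuming that the "background interval" of a pixel is a per-pixel intensity range estimated across the video frame sequence and that pixels falling outside that range are foreground; the percentile bounds and the function names are assumptions for illustration, not the exact procedure claimed here.

```python
# Minimal sketch of the abstract's pipeline (illustrative assumptions only):
# the "background interval" is modeled as per-pixel robust [low, high] bounds
# over the frame sequence, and foreground pixels are those outside the bounds.
import numpy as np

def background_interval(frames, lo_pct=5, hi_pct=95):
    """frames: (T, H, W) grayscale video; returns per-pixel (low, high) bounds."""
    low = np.percentile(frames, lo_pct, axis=0)
    high = np.percentile(frames, hi_pct, axis=0)
    return low, high

def foreground_mask(frame, low, high):
    """Mark a pixel as foreground if its value falls outside its background interval."""
    return (frame < low) | (frame > high)

def crop_foreground(frame, mask):
    """Crop the tight bounding box around the foreground pixels (None if empty)."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return frame[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

# Toy usage: foreground crops taken from frames of the same sequence can form a
# positive pair; crops from other sequences can serve as negatives.
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(30, 120, 160)).astype(np.float32)
low, high = background_interval(frames)
fg_crop = crop_foreground(frames[0], foreground_mask(frames[0], low, high))
```

In this reading, two views of the same foreground image would form the positive sample pair, while foregrounds of other objects supply the negative sample pair used in the self-supervised training step.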

Description

Technical field

[0001] The present application belongs to the technical field of image processing, and in particular relates to a feature extraction network training method, device, terminal equipment and medium.

Background technique

[0002] Self-supervised learning mainly uses auxiliary tasks to mine supervision information from large-scale unlabeled data and trains the network with this constructed supervision, so that it learns representations that are valuable for downstream tasks.

[0003] Self-supervised learning methods have been widely used in the field of object detection, but the regions cropped by existing self-supervised learning methods when sampling from the original image either cannot reflect the complete information of an object or contain a large amount of irrelevant information. Therefore, the currently learned features are incomplete or redundant, making the corresponding object detection poorly matched with downstream segmentation tasks.
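The training itself is described only as self-supervised learning over positive and negative sample pairs; one common way to realize such pair-based training (not necessarily the loss used in this application) is an InfoNCE-style contrastive objective, sketched below with an illustrative toy encoder and temperature value that are assumptions, not part of the disclosure.

```python
# InfoNCE-style contrastive loss over one positive pair and a set of negatives.
# This is a common instantiation of self-supervised pair training, shown only
# as an assumption about how the positive/negative pairs could be used.
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, temperature=0.1):
    """anchor, positive: (D,) embeddings; negatives: (N, D) embeddings."""
    anchor = F.normalize(anchor, dim=0)
    positive = F.normalize(positive, dim=0)
    negatives = F.normalize(negatives, dim=1)
    pos_logit = (anchor @ positive).unsqueeze(0) / temperature   # similarity to positive
    neg_logits = (negatives @ anchor) / temperature              # similarities to negatives
    logits = torch.cat([pos_logit, neg_logits]).unsqueeze(0)     # (1, N+1)
    target = torch.zeros(1, dtype=torch.long)                    # positive sits at index 0
    return F.cross_entropy(logits, target)

# Toy usage with a stand-in encoder on fixed-size foreground crops.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 64 * 64, 128))
anchor = encoder(torch.randn(1, 3, 64, 64)).squeeze(0)
positive = encoder(torch.randn(1, 3, 64, 64)).squeeze(0)
negatives = encoder(torch.randn(8, 3, 64, 64))
loss = info_nce(anchor, positive, negatives)
loss.backward()
```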


Application Information

IPC(8): G06V20/40, G06V10/40, G06V10/774, G06V10/82, G06T7/194, G06N3/08
CPC: G06T7/194, G06N3/08, G06T2207/10016, G06T2207/20081, G06T2207/20084, G06F18/214
Inventor: 李百双, 胡文泽, 黄坤
Owner SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD