Pre-training method and device based on Transformer structure

A pre-training and training-sample technology, applied in the field of model training, which solves the problems of high computing cost and the large human and material resources otherwise required, and achieves the effect of reducing computational load

Active Publication Date: 2022-06-28
北京医准智能科技有限公司

AI Technical Summary

Problems solved by technology

First, the dynamic nature of video brings additional difficulties to representation learning; second, having the model learn video representations from scratch is computationally expensive and requires ve...




Embodiment Construction

[0027] Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, which include various details of the embodiments of the present invention to facilitate understanding and should be considered as exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted from the following description for clarity and conciseness.

[0028] As shown in Figure 1, which is a schematic flowchart of a pre-training method based on the Transformer structure according to an embodiment of the present invention; as shown in Figure 5, which is a schematic diagram of determining the feature symbols of the image segmentation regions in an embodiment of the present invention; and as shown in Figure 6, which is a schematic diagram of...



Abstract

The invention discloses a pre-training method and device based on a Transformer structure. The method comprises the following steps: firstly, acquiring an image and a video of a target object; for any segmented region of the image or the video, taking the feature symbol of that segmented region as its label; carrying out mask processing on some of the segmented regions of the image and some of the segmented regions of the video to obtain a first training sample and a second training sample respectively; performing supervised prediction learning on the feature symbols of the masked regions in the first training sample based on the Transformer structure to obtain an initial model; initializing a pre-training model based on the initial model to obtain an initial pre-training model; and finally, performing supervised joint training on the first training sample and the second training sample with the initial pre-training model to obtain a final pre-training model. In this way, the model learns the spatial features and the temporal features of the video data at the same time, providing a good pre-training model for downstream tasks.
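To make the workflow above concrete, the following is a minimal sketch in PyTorch of the masked-region prediction step described in the abstract: segmented regions are assigned discrete feature symbols by a codebook-style tokenizer, some regions are masked, and a Transformer encoder is trained to predict the feature symbols at the masked positions. The tokenizer, class names, model sizes and 40% mask ratio are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn as nn

class PatchTokenizer(nn.Module):
    """Maps each segmented region to a discrete 'feature symbol' via a learned
    codebook; a stand-in for whatever tokenizer the method actually uses."""
    def __init__(self, patch_dim, vocab_size=512):
        super().__init__()
        self.codebook = nn.Parameter(torch.randn(vocab_size, patch_dim))

    @torch.no_grad()
    def forward(self, patches):                        # patches: (B, N, patch_dim)
        # squared distance from every region to every codeword -> (B, N, vocab_size)
        dist = (patches.unsqueeze(2) - self.codebook).pow(2).sum(-1)
        return dist.argmin(dim=-1)                     # (B, N) integer feature symbols

class MaskedRegionPretrainer(nn.Module):
    """Transformer encoder trained to predict the feature symbols of masked regions."""
    def __init__(self, patch_dim, vocab_size=512, dim=256, depth=4, heads=8):
        super().__init__()
        self.embed = nn.Linear(patch_dim, dim)
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, patches, mask):                  # mask: (B, N) bool, True = masked
        x = self.embed(patches)
        x = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(x), x)
        return self.head(self.encoder(x))              # (B, N, vocab_size) logits

# One illustrative training step on image regions; the video branch would flatten
# its space-time regions into the same (batch, regions, features) layout.
B, N, patch_dim = 2, 16, 768
patches = torch.randn(B, N, patch_dim)
tokenizer = PatchTokenizer(patch_dim)
model = MaskedRegionPretrainer(patch_dim)

labels = tokenizer(patches)                            # feature symbols used as labels
mask = torch.rand(B, N) < 0.4                          # mask roughly 40% of the regions
logits = model(patches, mask)
loss = nn.functional.cross_entropy(logits[mask], labels[mask])
loss.backward()
```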

Description

technical field
[0001] The invention relates to the technical field of model training, and in particular to a pre-training method and device based on a Transformer structure.
Background technique
[0002] At present, most tasks based on ultrasound images or videos (such as classification, segmentation or detection tasks) rely on a large amount of data; the labeling of ultrasound data is cumbersome, and different tasks need to be labeled with different content. Pre-training is a method of putting together a large amount of training data collected at low cost, learning the commonalities through model training, and then "transplanting" those commonalities to specific tasks. Self-supervised learning is a learning paradigm that uses the input data itself as supervision and benefits almost all types of downstream tasks.
[0003] Transformer is the main network structure in the field of natural language processing and has achieved great success on different natural language processing...
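As a rough illustration of the "transplant the commonalities" workflow in [0002], the sketch below fine-tunes a pre-trained Transformer backbone together with a small task head on a labeled downstream task. The checkpoint name, feature sizes and 3-class head are assumptions made for illustration, not details from the patent.

```python
import torch
import torch.nn as nn

# Pre-trained backbone; weights would come from the pre-training stage
# (the checkpoint filename below is hypothetical).
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=8, batch_first=True), num_layers=4)
# backbone.load_state_dict(torch.load("pretrained_backbone.pt"))

classifier = nn.Linear(256, 3)                       # e.g. a 3-class ultrasound task
optimizer = torch.optim.AdamW(
    list(backbone.parameters()) + list(classifier.parameters()), lr=1e-4)

tokens = torch.randn(8, 16, 256)                     # (batch, regions, feature dim)
targets = torch.randint(0, 3, (8,))
logits = classifier(backbone(tokens).mean(dim=1))    # pool regions, then classify
loss = nn.functional.cross_entropy(logits, targets)
loss.backward()
optimizer.step()
```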


Application Information

IPC(8): G06V10/764; G06V10/774; G06V10/80
CPC: G06F18/24; G06F18/253; G06F18/214
Inventor: 李小星, 马璐, 丁佳, 吕晨翀
Owner: 北京医准智能科技有限公司