
Data processing method and device for low computing power processing equipment

A technology relating to processing equipment with low computing power, applied in the computer field, which addresses the problem that existing neural networks cannot simultaneously achieve a compact structure, fast running speed, and high precision, and achieves the effect of improving accuracy and speed.

Pending Publication Date: 2020-05-19
BEIJING TUSEN ZHITU TECH CO LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0007] However, the neural network obtained according to the above scheme still cannot adequately meet the requirements of a compact structure, fast running speed, and high precision, and therefore cannot perform real-time computation on devices with low computing power.



Examples


Embodiment 1

[0042] Referring to FIG. 1, which is a flowchart of a method for constructing a neural network in an embodiment of the present invention, the method includes:

[0043] Step 101: Construct an initial neural network, wherein a plurality of preset specific structures in the initial neural network are each provided with a corresponding sparse scaling operator, and the sparse scaling operators are used to scale the outputs of the corresponding specific structures.

[0044] Step 102: Use preset training sample data to train the weights of the initial neural network and the sparse scaling operators of the specific structures to obtain an intermediate neural network.

[0045] Step 103: Delete the specific structures whose sparse scaling operators are zero from the intermediate neural network to obtain the target neural network.

[0046] Preferably, the foregoing step 101 can be implemented through the following steps A1 to A3:

[0047] Step A1: Select a neural network model.

[0048]In this...
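The three steps of this embodiment can be illustrated with a minimal PyTorch-style sketch. This is not the patent's reference implementation: it assumes the "specific structures" are residual branches and that sparsity of the scaling operators is encouraged with an L1 penalty, neither of which is fixed by the text above; all class, function, and parameter names (ScaledBlock, InitialNet, train_step, prune, l1_weight, eps) are illustrative.

```python
# Illustrative sketch of steps 101-103 (not the patent's reference implementation).
# Assumptions: the "specific structures" are residual branches, and sparsity of the
# scaling operators is encouraged with an L1 penalty; all names here are hypothetical.
import torch
import torch.nn as nn

class ScaledBlock(nn.Module):
    """A 'specific structure' whose output is scaled by a learnable sparse operator."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Sparse scaling operator (step 101): one scalar per specific structure.
        self.scale = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return x + self.scale * self.body(x)  # output of the structure scaled by the operator

class InitialNet(nn.Module):
    def __init__(self, channels=16, num_blocks=8, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList([ScaledBlock(channels) for _ in range(num_blocks)])
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x):
        x = self.stem(x)
        for blk in self.blocks:
            x = blk(x)
        return self.head(x.mean(dim=(2, 3)))

def train_step(model, images, labels, optimizer, l1_weight=1e-4):
    """Step 102: jointly train the weights and the scaling operators (L1 drives operators to zero)."""
    logits = model(images)
    loss = nn.functional.cross_entropy(logits, labels)
    loss = loss + l1_weight * sum(blk.scale.abs().sum() for blk in model.blocks)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def prune(model, eps=1e-3):
    """Step 103: delete the structures whose scaling operator is (numerically) zero."""
    kept = [blk for blk in model.blocks if blk.scale.abs().item() > eps]
    model.blocks = nn.ModuleList(kept)
    return model
```

In use, train_step would be called repeatedly on the initial network and prune would be called once training converges; step A1 (selecting a neural network model) corresponds to choosing the architecture of InitialNet.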

Embodiment 2

[0108] Based on the same inventive concept as the method for constructing a neural network provided in the first embodiment, the second embodiment of the present invention provides an apparatus for constructing a neural network. The structure of the apparatus is shown in FIG. 6 and includes:

[0109] The first construction unit 61 is configured to construct an initial neural network, wherein a plurality of preset specific structures in the initial neural network are each provided with a corresponding sparse scaling operator, and the sparse scaling operators are used to scale the outputs of the corresponding specific structures;

[0110] The training unit 62 is configured to use preset training sample data to train the weights of the initial neural network and the sparse scaling operators of the specific structures to obtain an intermediate neural network;

[0111] The second construction unit 63 is configured to delete the specific structure in the intermedi...
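For orientation only, the division into units 61-63 can be mirrored as a thin structural sketch; the class name NeuralNetworkBuilder and the callables it wires together are hypothetical stand-ins for whatever concrete implementations the units wrap, not the patent's API.

```python
# Structural sketch of the apparatus of Embodiment 2 (units 61-63).
# NeuralNetworkBuilder and its fields are hypothetical names, not the patent's API.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class NeuralNetworkBuilder:
    first_construction_unit: Callable[[], Any]       # unit 61: build the initial network
    training_unit: Callable[[Any, Any], Any]         # unit 62: train weights and scaling operators
    second_construction_unit: Callable[[Any], Any]   # unit 63: delete zero-operator structures

    def build(self, training_data: Any) -> Any:
        initial = self.first_construction_unit()                   # step 101
        intermediate = self.training_unit(initial, training_data)  # step 102
        return self.second_construction_unit(intermediate)         # step 103
```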



Abstract

The invention discloses a data processing method and device for processing equipment with low computing power. The method comprises the following steps: in a real-time computer-vision processing process, a processing device with low computing power acquires image data; the processing device performs computer-vision processing on the acquired image data by using a preset neural network to obtain a computer-vision processing result, wherein the preset neural network is a target neural network obtained through the following processing: an initial neural network is constructed, wherein a plurality of preset specific structures in the initial neural network are each provided with a corresponding sparse scaling operator, and the sparse scaling operators are used to scale the outputs of the corresponding specific structures; the weights of the initial neural network and the sparse scaling operators of the specific structures are trained with preset training sample data to obtain an intermediate neural network; and the specific structures whose sparse scaling operators are zero are deleted from the intermediate neural network to obtain the target neural network.
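As a minimal sketch of the runtime flow described in the abstract (a low-compute device acquires image data and runs the pre-pruned target network on it), assuming a PyTorch model and a single image tensor; process_frame and its signature are illustrative, not part of the patent.

```python
# Hypothetical sketch of the abstract's runtime flow on a low-compute device:
# the target network has already been pruned offline; the device only runs inference.
import torch

def process_frame(target_net: torch.nn.Module, frame: torch.Tensor) -> torch.Tensor:
    """Run computer-vision processing on one acquired image with the pruned target network."""
    target_net.eval()                          # inference mode (disables dropout / BN updates)
    with torch.no_grad():                      # no gradients needed for real-time processing
        return target_net(frame.unsqueeze(0))  # add a batch dimension: (1, C, H, W)
```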

Description

Technical Field

[0001] The present invention relates to the field of computers, and in particular to a data processing method and device for processing equipment with low computing power.

Background Technique

[0002] In recent years, deep neural networks have achieved great success in many fields, such as computer vision and natural language processing. However, deep neural network models often contain a large number of parameters, so the amount of computation is large and the processing speed is slow. A device with low computing power is a device whose computing power is lower than the computing power required by the computing task or computing model deployed on it.

[0003] To solve this problem, some solutions have been proposed:

[0004] Scheme 1: Hao Zhou pointed out in the paper "Less Is More: Towards Compact CNNs" that existing convolutional neural networks include a large number of param...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N 3/063; G06N 3/08
CPC: G06N 3/08; G06N 3/063; G06N 3/061
Inventors: 王乃岩; 黄泽昊
Owner: BEIJING TUSEN ZHITU TECH CO LTD