
An object appearance detection method and deep neural network model

A deep neural network and appearance detection technology, applied in biological neural network models, neural architectures, character and pattern recognition, and related fields. It addresses the problems that manually defined rules are inherently flawed and cannot judge situations not previously encountered, while achieving low hardware cost, overcoming inefficient manual operation, and keeping the modules simple.

Publication date: 2019-01-15 (pending application)
Cites: 0 | Cited by: 7

AI Technical Summary

Problems solved by technology

Although rule-based inspection can improve work efficiency, the rules are formulated mainly from operators' past experience and historical test data, and it is difficult for humans to discover the mapping relationships hidden within large volumes of data. The resulting rule definitions are therefore seriously flawed, and this method cannot, in principle, completely solve the problem.

Embodiment Construction

[0036] To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art, based on the embodiments of the present invention and without creative work, shall fall within the protection scope of the present invention.

[0037] The terms "first", "second", and "third" in the specification, claims, and the aforementioned drawings of the present invention are used to distinguish different objects rather than to describe a specific sequence. In addition, the term "including" and any variations thereof are intended to cover non-exclusive inclusion. For example,...

Abstract

The invention relates to a method for detecting the appearance of an object and a deep neural network model. The method comprises the following steps: building a deep neural network model; training the model through repeated cycles of learning so that it acquires the ability to distinguish appearances; allowing the model, over many training cycles, to gradually converge to the optimal weight of each feature; and performing a convolution operation on the appearance image of the object under inspection, then obtaining a classification result according to a set probability interval. The deep neural network model comprises a training module, an evaluation module, and a prediction module. The invention solves many problems that arise when judgment rules for object appearance classification are formulated manually, and overcomes the low efficiency of manual inspection and the low accuracy of traditional automatic judgment. Because the model can sustainably iterate and upgrade itself, its recognition efficiency will, in theory, keep improving. The modules of the invention are simple, the hardware cost is low, and the range of application is wide.
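The abstract names a training module, an evaluation module, and a prediction module, plus classification by a set probability interval. The patent discloses no code, so what follows is only a minimal sketch of how such a pipeline might look, written here in PyTorch; the AppearanceNet architecture, the module function names, the threshold values, and the assumption that class 0 means "good appearance" are all hypothetical.

```python
# Illustrative sketch only: none of this comes from the patent itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AppearanceNet(nn.Module):
    """Small CNN: convolves an appearance image into class scores."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 56 * 56, num_classes)  # assumes 224x224 RGB input

    def forward(self, x):
        return self.head(torch.flatten(self.features(x), 1))

def training_module(model, loader, epochs=10, lr=1e-3):
    """Cyclic training: repeated passes let the feature weights gradually converge."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            F.cross_entropy(model(images), labels).backward()
            opt.step()

@torch.no_grad()
def evaluation_module(model, loader):
    """Accuracy on held-out images, used to judge whether training has converged."""
    model.eval()
    correct = total = 0
    for images, labels in loader:
        correct += (model(images).argmax(dim=1) == labels).sum().item()
        total += labels.numel()
    return correct / total

# Invented probability-interval bounds; the patent does not disclose values.
PASS_THRESHOLD, REVIEW_THRESHOLD = 0.9, 0.5

@torch.no_grad()
def prediction_module(model, image):
    """Classify one appearance image by bucketing its 'good' probability."""
    model.eval()
    p_good = F.softmax(model(image.unsqueeze(0)), dim=1)[0, 0].item()
    if p_good >= PASS_THRESHOLD:
        return "pass"
    if p_good >= REVIEW_THRESHOLD:
        return "manual review"
    return "reject"
```

Under this reading, evaluation_module is what signals that the weights have gradually converged, and prediction_module implements the probability-interval idea: images the model is unsure about fall into a middle interval and are routed to manual review rather than force-classified.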

Description

【Technical Field】

[0001] The invention relates to the technical field of intelligent image recognition, specifically to the automatic detection of the appearance of an object, and in particular to an object appearance detection method and a deep neural network model.

【Background technique】

[0002] With the continuous development of industrial production and the continuous growth of material demand, the annual output of some consumer electronic products now exceeds 100 million units. At present, the mainstream object appearance inspection method still relies on manual visual inspection by production-line operators, performed against defined judgment standards. Manual inspection requires a great deal of manpower, has a high error rate and low efficiency, and easily causes a production-capacity bottleneck and a high rate of returns to the factory. In order to improve efficiency, save manpower, and reduce customer c...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/62; G06N 3/04
CPC: G06N 3/045; G06F 18/24; G06F 18/214
Inventor: 王新维
Owner: 深圳宇骏视觉智能科技有限公司