
Moving workpiece recognition method based on spatiotemporal contexts and fully convolutional network

A technology combining a spatiotemporal context model with a fully convolutional network, applied in the field of digital image processing for target detection and recognition, to improve the degree of intelligence and achieve semantic-level segmentation and classification.

Inactive Publication Date: 2017-12-08
KUNMING UNIV OF SCI & TECH


Problems solved by technology

[0005] The present invention provides a moving workpiece recognition method based on a spatiotemporal context model and a fully convolutional network. It solves the problem of industrial robots tracking and recognizing moving targets on conveyor belts, and provides a theoretical basis for improving the automation and intelligence of industrial robots.



Examples


Embodiment 1

[0058] Embodiment 1: As shown in Figures 1-9, the moving workpiece recognition method based on a spatiotemporal context model and a fully convolutional network proceeds in three stages. First, a target image database covering five common machine-industry tools and workpieces (bearings, screwdrivers, gears, pliers, and wrenches) is used to train the fully convolutional neural network, yielding a classifier for the targets to be classified. Then, the background difference method and morphological operations from digital image processing are used to obtain the initial position of the target in the first frame of the video sequence; starting from this initial position, the spatiotemporal context model tracking method tracks the target, and a precision plot verifies the tracking accuracy. Finally, the trained classifier classifies and identifies the tracking results to achieve semantic-level segmentation, thereby obtaining the target category; semantic classification and recognition performance is verified against the ground truth.
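The first detection step above (background difference plus a morphological opening to suppress speckle noise, then a bounding box around the surviving foreground pixels) can be sketched as follows. This is a minimal illustration in plain NumPy, not the patent's actual implementation; the function name, 3x3 structuring element, and threshold value are assumptions.

```python
import numpy as np

def initial_position(background, frame, thresh=30):
    """Locate a moving target's initial bounding box via background
    differencing followed by a 3x3 morphological opening (illustrative).
    Returns (x0, y0, x1, y1) or None if no motion is detected."""
    # Background difference: pixels whose change exceeds the threshold
    diff = np.abs(frame.astype(int) - background.astype(int)) > thresh

    def shift_all(mask, combine, pad_value):
        # Apply a 3x3 structuring element by combining shifted copies
        p = np.pad(mask, 1, constant_values=pad_value)
        out = p[1:-1, 1:-1].copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out = combine(out, p[1 + dy:p.shape[0] - 1 + dy,
                                     1 + dx:p.shape[1] - 1 + dx])
        return out

    # Opening = erosion (AND over neighborhood) then dilation (OR),
    # which removes isolated noise pixels while preserving the blob
    eroded = shift_all(diff, np.logical_and, False)
    opened = shift_all(eroded, np.logical_or, False)

    ys, xs = np.nonzero(opened)
    if ys.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```

The returned box would seed the tracker with the target's first-frame position.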



Abstract

The invention relates to a moving workpiece recognition method based on spatiotemporal contexts and a fully convolutional network, and belongs to the fields of digital image processing and object detection and recognition. In the method, an object image database is used to train the fully convolutional neural network, yielding a classifier for the objects to be classified. Then the background difference method and morphological operations from digital image processing are used to obtain the initial position of the object in the first frame of a video sequence; starting from this initial position, an object tracking method based on spatiotemporal context models tracks the object, and tracking precision is verified with a precision plot. Finally, the trained classifier performs classification and recognition on the tracking result, realizing semantic-level segmentation and thereby obtaining the object category. The background difference method and morphological processing effectively and automatically acquire the initial position of the moving object, so tracking and recognition of moving workpieces on a conveyor belt can be realized, improving the automation and intelligence of industrial robots.
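The key property of a fully convolutional network, as used in the abstract's final classification stage, is that every layer is a convolution, so the output is a dense per-pixel score map (semantic segmentation) rather than a single image-level label. A toy forward pass, assuming hypothetical hand-chosen kernels and weights rather than the patent's trained network:

```python
import numpy as np

def conv2d_same(x, k):
    """Naive 'same'-padded 2-D convolution of one channel (illustration only)."""
    kh, kw = k.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def fcn_segment(img, feature_kernels, class_weights):
    """Minimal fully convolutional forward pass: conv + ReLU feature maps,
    then a 1x1 'classification' convolution that scores every pixel, so the
    output has the same spatial size as the input (a semantic label map)."""
    feats = np.array([np.maximum(conv2d_same(img, k), 0.0)
                      for k in feature_kernels])             # (C, H, W)
    scores = np.einsum('chw,kc->khw', feats, class_weights)  # 1x1 conv -> (K, H, W)
    return scores.argmax(axis=0)                             # per-pixel class label
```

Because there are no fully connected layers, the same network accepts inputs of any spatial size, which is what makes per-pixel workpiece/background labeling possible.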

Description

technical field
[0001] The invention relates to a moving workpiece recognition method based on a spatiotemporal context model and a fully convolutional network, and belongs to the technical field of digital image processing and target detection and recognition.
Background technique
[0002] In the new era, industrial sites place ever higher demands on automation, and target detection and recognition by industrial robots has become one of the hotspots and difficulties in Industry 4.0 advanced manufacturing. The key technologies include: 1) against a moving background, obtaining the initial position of the workpiece to be grasped, extracting features of the tracked workpiece object, separating the target from the complex moving background, and obtaining the real-time position of the moving target; 2) classifying and identifying the tracked targets to realize semantic segmentation of workpiece objects against the moving background.
[0003] However, ther...
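Key technology 1) above (real-time localization against a moving background) is what the spatiotemporal context (STC) tracker addresses: it learns a spatial context model around the current target position and applies it to the next frame in the Fourier domain, taking the maximum of the resulting confidence map as the new position. The sketch below shows one simplified step; parameter names and values are illustrative, and the published STC formulation includes details (confidence normalization, online model update rate) omitted here.

```python
import numpy as np

def stc_track_step(prior_frame, frame, pos, sigma=2.0, alpha=2.25, beta=1.0):
    """One simplified spatiotemporal-context tracking step.
    Learns a spatial context model around `pos` from the prior frame,
    applies it to the current frame via FFT (convolution theorem), and
    returns the position of the new confidence-map maximum."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - pos[0], ys - pos[1])
    conf = np.exp(-(dist / alpha) ** beta)            # target confidence map
    weight = np.exp(-(dist ** 2) / (2 * sigma ** 2))  # focus-of-attention weighting
    eps = 1e-8                                        # guard against division by zero
    # Learn the spatial context model in the frequency domain
    Hstc = np.fft.fft2(conf) / (np.fft.fft2(prior_frame * weight) + eps)
    # Apply the model to the weighted current frame to get the new confidence map
    new_conf = np.real(np.fft.ifft2(Hstc * np.fft.fft2(frame * weight)))
    iy, ix = np.unravel_index(int(np.argmax(new_conf)), new_conf.shape)
    return ix, iy
```

Working in the frequency domain turns the spatial convolutions into elementwise products, which is what makes this family of trackers fast enough for real-time conveyor-belt use.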


Application Information

IPC (IPC8): G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/2413
Inventor: 张印辉, 张春全, 何自芬, 王森, 田敏
Owner KUNMING UNIV OF SCI & TECH