Method for training object classification model and identification method using object classification model

A training method and object classification technology, applied in the field of training object classification models, which addresses the problems that existing methods do not make good use of inter-frame information and do not consider the influence of viewing angle on object features.

Active Publication Date: 2011-04-06
WUXI ZGMICRO ELECTRONICS CO LTD

AI Technical Summary

Problems solved by technology

[0012] Recently, block Local Binary Pattern (LBP) features have been used to compute sample features in this kind of object type training and recognition method. Although block LBP represents the neighborhood information of the object to a certain extent and can achieve good results on some images, it does not make good use of inter-frame information.
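For readers unfamiliar with the prior-art feature mentioned above, a minimal block LBP sketch follows. It assumes plain 8-neighbour LBP codes pooled into per-block histograms over a fixed grid; the grid size, normalization, and use of NumPy are illustrative choices, not details taken from the patent.

```python
import numpy as np

def lbp_codes(gray):
    """Plain 8-neighbour LBP: compare each pixel with its 8 neighbours
    and pack the comparison bits into an 8-bit code."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]
    neighbours = [g[0:-2, 0:-2], g[0:-2, 1:-1], g[0:-2, 2:],
                  g[1:-1, 2:],   g[2:,   2:],   g[2:,   1:-1],
                  g[2:,   0:-2], g[1:-1, 0:-2]]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        codes |= ((n >= c).astype(np.uint8) << bit)
    return codes

def block_lbp_histogram(gray, grid=(4, 4)):
    """Split the LBP code image into a grid of blocks and concatenate
    the per-block 256-bin histograms into one feature vector."""
    codes = lbp_codes(gray)
    h, w = codes.shape
    bh, bw = h // grid[0], w // grid[1]
    feats = []
    for by in range(grid[0]):
        for bx in range(grid[1]):
            block = codes[by*bh:(by+1)*bh, bx*bw:(bx+1)*bw]
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            feats.append(hist / max(hist.sum(), 1))  # normalise per block
    return np.concatenate(feats)
```

The concatenated histogram can then be fed to any standard classifier trainer; note that, as the patent points out, a per-image feature of this kind still carries no inter-frame information on its own.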
On the other hand, in existing object type training and recognition methods, the extracted foreground object area is simply scaled, and the influence of different viewing angles on the object features in the image is not considered.



Embodiment Construction

[0043] The detailed description of the present invention is presented mainly through programs, steps, logic blocks, processes, or other symbolic descriptions that directly or indirectly simulate the operation of the technical solution of the present invention. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention; however, the invention may be practiced without these specific details. These descriptions and representations are the means used by those skilled in the art to convey the substance of their work effectively to others skilled in the art. In order to avoid obscuring the present invention, well-known methods, procedures, components, and circuits have not been described in detail, since they are readily understood by those skilled in the art.

[0044] Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one implementation of the invention.



Abstract

The invention discloses a method for training an object classification model. The method comprises the following steps: acquiring a plurality of frames of images and determining initial foreground object areas from the images; performing principal axis normalization conversion on the initial foreground object areas in the images of different frames; aligning the corresponding initial foreground object areas subjected to the principal axis normalization conversion in the images of different frames to obtain a final foreground object area; extracting features from the final foreground object area; and training the object classification model with the extracted features. By applying principal axis normalization conversion to the extracted foreground object areas, the method overcomes the influence of different viewing angles on the shapes of the foreground object areas and reduces the intra-class distance; meanwhile, the foreground object areas in the images of different frames are aligned by block matching, and the effective foreground object area extracted from them is used as the final foreground object area.
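The abstract does not spell out how the principal axis normalization or the block-matching alignment is computed, so the sketch below is only one plausible reading: the principal axis of the foreground mask is estimated from second-order central moments and the region is rotated until that axis is vertical, and the previous frame's region is located in the current frame by normalized cross-correlation. The function names, the use of OpenCV, and all parameters are assumptions for illustration, not the patent's own definitions.

```python
import cv2
import numpy as np

def principal_axis_normalize(mask, crop):
    """Rotate a foreground region so that the principal axis of its
    binary mask (estimated from second-order central moments) becomes
    vertical. One plausible reading of 'principal axis normalization';
    the patent may define the transform differently."""
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return crop
    # Orientation of the major axis relative to the image x-axis.
    theta = 0.5 * np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
    angle = np.degrees(theta) - 90.0          # bring the axis upright
    h, w = crop.shape[:2]
    rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
    return cv2.warpAffine(crop, rot, (w, h))  # corners may be clipped

def align_by_block_matching(prev_region, search_window):
    """Find the best match of the previous frame's foreground region
    inside a search window of the current frame (block matching via
    normalized cross-correlation) and return the matched patch."""
    score = cv2.matchTemplate(search_window, prev_region, cv2.TM_CCOEFF_NORMED)
    _, _, _, top_left = cv2.minMaxLoc(score)
    x, y = top_left
    th, tw = prev_region.shape[:2]
    return search_window[y:y + th, x:x + tw]
```

Features (for example the block LBP histograms sketched earlier) would then be extracted from the aligned, normalized regions and used to train the classification model, as the abstract describes.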

Description

【Technical field】

[0001] The invention relates to the technical field of pattern recognition, and in particular to the training of an object classification model and a recognition method using the model.

【Background technique】

[0002] Object type recognition is an important research topic in the field of intelligent video surveillance, and it is an application of computer image processing and pattern recognition technology in that field. Object type recognition is widely used in road monitoring, financial and banking services, waterway management and other industries.

[0003] Among existing object type recognition technologies there are many different approaches, for example object type recognition based on heuristic rules, which analyzes simple information such as the shape, size, and proportions of the extracted foreground object to determine the object type. This method ...
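As a concrete illustration of the heuristic-rule approach described above, a toy classifier might look at nothing more than a blob's bounding-box aspect ratio and fill ratio. The thresholds and labels below are invented for illustration and are not taken from the patent, which cites this style of method only as background.

```python
def classify_by_heuristics(width, height, area):
    """Toy heuristic-rule classifier of the kind the background section
    describes: decide the object type from simple shape/size cues.
    All thresholds and labels here are illustrative assumptions."""
    aspect = width / float(height)
    fill = area / float(width * height)   # how much of the box the blob fills
    if aspect < 0.6 and height > 80:      # tall, narrow blob
        return "pedestrian"
    if aspect > 1.5 and fill > 0.5:       # wide, well-filled blob
        return "vehicle"
    return "unknown"
```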


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/66
Inventor: 邓亚峰 (Deng Yafeng)
Owner: WUXI ZGMICRO ELECTRONICS CO LTD