
Multi-moving-object feature expressing method suitable for different scenes

A technology applied in the field of moving object recognition, which can solve the problems of redundant, low-contribution, and even useless invariant moment values.

Publication Date: 2013-09-04 (status: Inactive)
JIANGSU UNIV

AI Technical Summary

Problems solved by technology

The classification techniques that combine different types of invariant moments in the above methods all target specific moving objects in specific scenes within specific fields. The field of video surveillance, however, is very broad, covering scenarios as different as land and water transportation, residential areas, and intelligent buildings, and the moving objects that need to be identified are also of many types. When classifying different moving objects in different scenes, using more invariant moment feature values does not necessarily mean stronger recognition ability: the full set of invariant moment values may contain redundant, low-contribution, or even useless values, and these redundant invariant moment values reduce the recognition rate of moving objects.



Examples


Embodiment 1

[0042] An example in the road monitoring scenario; the process is shown in Figure 2.

[0043] Step 1: Select the moving object categories {ordinary car (Car), pedestrian (Person), bus (Bus), medium-sized bus (Van), bicycle (Bicycle)}.

[0044] Treat the N moving objects appearing in the road surveillance video as classes (Class); the class set is {CCar, CPerson, CBus, CVan, CBicycle}. Then extract the specific objects corresponding to each class from the video: the CCar class yields {Car1, Car2, Car3, ..., Cari}, the CPerson class yields {Person1, Person2, Person3, ..., Personi}, the CBus class yields {Bus1, Bus2, Bus3, ..., Busi}, the CVan class yields {Van1, Van2, Van3, ..., Vani}, and the CBicycle class yields {Bicycle1, Bicycle2, Bicycle3, ..., Bicyclei}. For each Car object, calculate the various invariant moment values of its various angle forms, such as: ...

[0045] For the other moving objects in road monitoring, ...
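As an illustration of this step, the sketch below computes invariant moment values for a segmented object mask at several rotated angle forms. The patent excerpt does not name the specific moment families it combines, so Hu's seven invariant moments (via OpenCV), the angle set, and the helper names are assumptions made purely for illustration; the mask variable car1_mask is hypothetical.

```python
# Hedged sketch only: the moment family (Hu moments), the angle set, and the
# helper names are assumptions; the patent's own combined moment set is not
# reproduced in this excerpt.
import cv2
import numpy as np

def hu_moments_of_mask(mask: np.ndarray) -> np.ndarray:
    """Seven Hu invariant moments of a binary object mask."""
    m = cv2.moments(mask, binaryImage=True)
    return cv2.HuMoments(m).flatten()

def angle_form_moments(mask: np.ndarray, angles=(0, 45, 90, 135)) -> np.ndarray:
    """Rotate the mask into several angle forms and stack one moment vector per angle."""
    h, w = mask.shape
    center = (w / 2.0, h / 2.0)
    rows = []
    for a in angles:
        rot = cv2.getRotationMatrix2D(center, a, 1.0)
        rotated = cv2.warpAffine(mask, rot, (w, h), flags=cv2.INTER_NEAREST)
        rows.append(hu_moments_of_mask(rotated))
    return np.vstack(rows)  # shape: (num_angles, 7)

# Hypothetical usage with an extracted object mask:
# moments_per_angle = angle_form_moments(car1_mask)
```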

Embodiment 2

[0072] Example in the river channel monitoring scenario:

[0073] Step 1: Select the moving object categories {boat (Boat), ordinary car (Car), small crane (SmallCrane), medium-sized crane (Medium-sizedCrane), pedestrian (Person)}.

[0074] Treat the N moving objects appearing in the river channel monitoring video as classes (Class); the class set is {CBoat, CCar, CSmallCrane, CMedium-sizedCrane, CPerson}. Then extract the specific objects corresponding to each class from the video: the CBoat class yields {Boat1, Boat2, Boat3, ..., Boati}, and so on for the specific objects of all categories.

[0075] For each Boat object, calculate the various invariant moment values of its various angle forms, such as: ... Then use the mean value and formula (1) to further calculate the initial input data as follows:

[0076] ...

[0077] For other moving objects in river channel monitoring, refer to the calculation process of the Boat object.
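Formula (1) is not reproduced in this excerpt, so the sketch below assumes the simplest reading of "using the mean value": each invariant moment is averaged over the object's angle forms to yield its initial input vector. The function names and the plain per-moment mean are illustrative assumptions, not the patent's exact formula.

```python
# Hedged sketch: a plain per-moment mean over angle forms is assumed here;
# the patent's formula (1) may define a different combination.
import numpy as np

def initial_input_vector(moments_per_angle: np.ndarray) -> np.ndarray:
    """Average each invariant moment over all angle forms of one object."""
    return moments_per_angle.mean(axis=0)  # shape: (num_moments,)

# Hypothetical usage: one row per Boat object, stacked into a class matrix.
# X_boat = np.vstack([initial_input_vector(angle_form_moments(m)) for m in boat_masks])
```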

[0078] The sec...

Embodiment 3

[0087] An example in the community monitoring scenario:

[0088] Step 1: Select the moving object categories {bicycle (Bicycle), medium-sized van (Van), ordinary car (Car), pedestrian (Person)}.

[0089] Treat the N moving objects appearing in the community monitoring video as classes (Class); the class set is {CBicycle, CVan, CCar, CPerson}. Then extract the specific objects corresponding to each class from the video: the CBicycle class yields {Bicycle1, Bicycle2, Bicycle3, ..., Bicyclei}, and so on for the specific objects of all categories.

[0090] For each Bicycle object, calculate the various invariant moment values of its various angle forms, such as: ...

[0091]

[0092] Use the mean value and formula (1) to further calculate the initial input data as follows:

[0093] ...

[0094] For the other moving objects in community monitoring, refer to the calculation process for the Bicycle object.

[0095] The second step: use formula (2) to calculate all th...
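Formula (2), the SF-ISF weight, is also not reproduced in this excerpt. Based on the abstract's description of a "similar frequency-inverse singular frequency" weighting, the sketch below assumes a TF-IDF-style scheme: a moment that behaves consistently within a class but not across many classes receives a larger weight. The tolerance, the threshold, and the function name are illustrative assumptions, not the patent's formula.

```python
# Hedged sketch: an SF-ISF-style weighting modeled on TF-IDF; the patent's
# actual formula (2) is not shown in this excerpt.
import numpy as np

def sf_isf_weights(class_matrices, tol=0.1):
    """class_matrices: one (num_objects, num_moments) array per class.
    Returns per-class, per-moment weights of shape (num_classes, num_moments)."""
    num_classes = len(class_matrices)
    num_moments = class_matrices[0].shape[1]
    sf = np.zeros((num_classes, num_moments))
    for c, X in enumerate(class_matrices):
        mean = X.mean(axis=0)
        # "similar frequency": fraction of objects whose moment stays near the class mean
        near_mean = np.abs(X - mean) <= tol * (np.abs(mean) + 1e-12)
        sf[c] = near_mean.mean(axis=0)
    # "inverse singular frequency": down-weight moments that look similar in many classes
    isf = np.log((num_classes + 1.0) / (1.0 + (sf > 0.5).sum(axis=0)))
    return sf * isf
```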



Abstract

The invention discloses a multi-moving-object feature expressing method suitable for different scenes. Since different moving objects have different characteristics, an adaptive combined invariant moment method is provided in which dynamically selected invariant moment values describe the features of different moving objects. By defining a similar frequency-inverse singular frequency (SF-ISF) method, a weight for each object's invariant moment values is calculated; the weighted invariant moment values and the combined invariant moment values are then used as input parameters to build multi-class classifier models that classify the various moving objects in a scene. The method effectively reduces computation time, achieves a high recognition rate for moving objects, is suitable for identifying moving objects under real-time monitoring, and is applicable to various video surveillance scenes.
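The abstract does not name a classifier family, so the sketch below assumes a multi-class SVM from scikit-learn purely to illustrate how the weighted invariant moment vectors could feed a classifier; the feature matrix X, labels y, and weight vector w are hypothetical placeholders.

```python
# Hedged sketch: a multi-class SVM is assumed; the patent only says that
# "multiple types of classifier models are established".
import numpy as np
from sklearn.svm import SVC

def train_scene_classifier(X: np.ndarray, y: np.ndarray, w: np.ndarray) -> SVC:
    """X: (num_objects, num_moments) moment features, y: class labels,
    w: per-moment weights (e.g. SF-ISF weights collapsed to one vector)."""
    clf = SVC(kernel="rbf", decision_function_shape="ovr")
    clf.fit(X * w, y)  # apply the moment weighting before training
    return clf

# Hypothetical usage:
# clf = train_scene_classifier(X_train, y_train, w)
# predictions = clf.predict(X_test * w)
```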

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a method for recognizing moving objects.

Background technique

[0002] In statistics, moments are used to characterize the distribution of a random quantity. If a binary or grayscale image is regarded as a two-dimensional density distribution function, its image features can be described by moments. Moment features are a kind of regional feature; invariant moments perform image recognition by extracting mathematical features of the image that are invariant to translation, rotation, and scale. The theory of invariant moments was proposed in 1962 and has been continuously developed and refined since, forming a great variety of types. Each type of invariant moment has its own suited data and calculation category, and invariant moments of the same type with different magnitudes also hav...
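For reference, these are the standard definitions behind the background above (Hu, 1962), not formulas taken from this patent: the raw and central moments of an image f(x, y), the normalized central moments, and the first two of Hu's seven invariants.

```latex
% Standard moment definitions (Hu, 1962); not the patent's own formulas (1) or (2).
\begin{aligned}
m_{pq} &= \sum_{x}\sum_{y} x^{p} y^{q} f(x,y), \qquad
\bar{x} = \frac{m_{10}}{m_{00}}, \quad \bar{y} = \frac{m_{01}}{m_{00}}, \\
\mu_{pq} &= \sum_{x}\sum_{y} (x-\bar{x})^{p}(y-\bar{y})^{q} f(x,y), \qquad
\eta_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,1+(p+q)/2}}, \\
\phi_{1} &= \eta_{20} + \eta_{02}, \qquad
\phi_{2} = (\eta_{20}-\eta_{02})^{2} + 4\eta_{11}^{2}.
\end{aligned}
```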

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/66
Inventor 陈潇君詹永照柯佳汪满容陈小波
Owner JIANGSU UNIV