
Multi-group image supervised classification method based on empirical mode decomposition

An empirical mode decomposition and supervised classification technology, applied in the field of image processing, which solves the problems of low classification accuracy and insufficient feature utilization in existing methods, and achieves good consistency and improved classification accuracy.

Active Publication Date: 2010-08-11
哈尔滨工业大学高新技术开发总公司 (Harbin Institute of Technology High-Tech Development General Corporation)

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problems of insufficient feature utilization and low classification accuracy in existing supervised classification methods, and to provide a supervised classification method for multi-group images based on empirical mode decomposition.


Examples


Specific Embodiment 1

[0023] Specific Embodiment 1: This embodiment is described below with reference to Figure 1 to Figure 10, and comprises the following steps:

[0024] Step 1: Perform empirical mode decomposition on the training feature vector formed by the characteristic curve of each pixel in the training set of the multi-group image, and obtain the extended training feature vector of each training pixel;

[0025] Perform empirical mode decomposition on the test feature vector formed by the characteristic curve of each pixel in the test set of the multi-group image, and obtain the extended test feature vector of each test pixel;

[0026] The extended training feature vector or extended test feature vector is obtained as follows:

[0027] Let the input vector signal be x(t), where 1 ≤ t ≤ N and N is the total number of bands of the multi-group image;

[0028] Let r_n(t) be the residual trend function of the n-th intrinsic mode function deco...
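For illustration only, the feature-extension step of paragraphs [0024]-[0028] can be sketched roughly as follows, assuming the third-party PyEMD package (pip install EMD-signal); the helper name, the number of retained modes, and the zero-padding rule are assumptions for this sketch, not the patent's exact combination rule:

```python
# A minimal sketch of the EMD feature extension, under the assumptions above.
import numpy as np
from PyEMD import EMD

def extend_feature_vector(x, n_modes=3):
    """EMD-decompose a pixel's band curve x(t), 1 <= t <= N, and append the
    first n_modes intrinsic mode functions and the residual trend r_n(t)
    to the original curve, giving one extended feature vector."""
    x = np.asarray(x, dtype=float)
    emd = EMD()
    emd.emd(x, max_imf=n_modes)                  # sift at most n_modes IMFs
    imfs, residual = emd.get_imfs_and_residue()  # residual trend r_n(t)
    rows = list(imfs) + [residual]
    # Zero-pad so every pixel yields the same extended dimension even when
    # its curve produces fewer than n_modes IMFs.
    while len(rows) < n_modes + 1:
        rows.append(np.zeros_like(x))
    return np.concatenate([x] + rows[: n_modes + 1])
```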

Specific Embodiment

[0073] A hyperspectral image is one of the typical multi-group images. The 92AV3C hyperspectral image comes from a remote sensing observation of an agricultural area in northwestern Indiana, USA, by the AVIRIS (Airborne Visible / Infrared Imaging Spectrometer) sensor. The data set contains 220 bands (the other 4 bands are all zero and were discarded), covering approximately 0.40 μm to 2.45 μm at roughly 10 nm intervals, and is accompanied by a reference map of each pixel's category calibrated through field investigation. This reference map can be used to construct a training set and to calculate the classification accuracy of the classification method on the test set. In this embodiment, the 7 categories with the most pixels in the 92AV3C hyperspectral image are selected, which provides a sufficient data basis for the cross-validation experiment. The experiment uses 5-fold cross validation to calculate the classification accuracy, which makes ...
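For reference, the 5-fold cross-validation accuracy described above could be estimated with a sketch like the following, using scikit-learn; loading of the 92AV3C cube and its reference map is not shown, and the RBF kernel and C value are placeholder assumptions rather than the patent's reported settings:

```python
# A rough sketch: X holds one extended-dimension feature vector per labelled
# pixel of the 7 retained classes, y holds the class labels.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def five_fold_accuracy(X, y):
    """Mean classification accuracy of an SVM over 5 cross-validation folds.
    scikit-learn's SVC builds its multi-class decision from pairwise
    (one-vs-one) binary sub-classifiers, matching the one-to-one strategy
    of step 4 of the method."""
    clf = SVC(kernel="rbf", C=100.0, gamma="scale")  # placeholder hyperparameters
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    return float(scores.mean())
```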

Specific Embodiment 2

[0093] Specific Embodiment 2: This embodiment differs from Embodiment 1 only in that T = 0.25; everything else is the same as in Embodiment 1.



Abstract

The invention relates to a multi-group image supervised classification method based on empirical mode decomposition, belonging to the field of image processing. The method solves the problems of insufficient feature utilization and low classification accuracy in existing supervised classification methods. The method comprises the following steps: 1, perform empirical mode decomposition on the feature vector of each pixel and obtain an expanded feature vector; 2, combine the original feature vector and the expanded feature vector of each pixel according to a uniform rule to obtain an extended-dimension feature vector; 3, train support vector machines and judge the category to which each extended-dimension test feature vector belongs, forming a plurality of support vector machine sub-classifiers; 4, build a multi-classifier based on a one-to-one strategy to decide the category of each pixel, completing the classification of the multi-group image. The method is used for multi-group image pattern recognition that requires high classification accuracy.
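For illustration, the four steps of the abstract can be sketched compactly as follows, reusing the hypothetical extend_feature_vector helper from the sketch under Specific Embodiment 1 together with scikit-learn's explicit one-vs-one wrapper; all names and hyperparameters are assumptions:

```python
# A compact pipeline sketch under the assumptions above (requires the
# extend_feature_vector helper sketched earlier in this document).
import numpy as np
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC

def classify_multigroup(train_curves, train_labels, test_curves):
    """Steps 1-2: expand each pixel's characteristic curve by EMD and
    concatenate it with the original curve; steps 3-4: train binary SVM
    sub-classifiers and combine their decisions with a one-to-one
    (pairwise voting) strategy."""
    X_train = np.array([extend_feature_vector(c) for c in train_curves])
    X_test = np.array([extend_feature_vector(c) for c in test_curves])
    multi = OneVsOneClassifier(SVC(kernel="rbf", gamma="scale"))
    multi.fit(X_train, train_labels)
    return multi.predict(X_test)
```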

Description

Technical Field

[0001] The invention relates to a supervised classification method for multi-group images based on empirical mode decomposition, and belongs to the field of image processing.

Background Technology

[0002] Multi-group images are groups of multi-band images with high mutual correlation. They have a large number of physical prototypes in fields such as earth observation, medical diagnosis, and radar detection, for example hyperspectral images, medical ultrasound images, and sea-level fluctuation images. They are generally continuous observations, or multi-spectral (spectroscopic) observations, of the same area and often contain images of hundreds or thousands of bands. The images of the individual bands are generally highly correlated, carry little distinguishing information, and contain a large amount of redundant information.

[0003] Each pixel in a multi-group image corresponds to a characteristic curve covering all bands. Directly using the characteristic curve of each pixel as a feature...
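As a small illustration of these per-pixel characteristic curves, assuming the multi-group image is held as a NumPy array of shape (rows, cols, bands); the names are purely illustrative:

```python
# Flatten a multi-group image cube so that each row is one pixel's
# characteristic curve across all bands.
import numpy as np

def pixel_characteristic_curves(cube):
    """Reshape an (H, W, N) multi-group image so that each row is one pixel's
    characteristic curve x(t), 1 <= t <= N, covering all N bands."""
    H, W, N = cube.shape
    return cube.reshape(H * W, N)
```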


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62
Inventors: 沈毅, 张淼, 王艳, 金晶, 林玉荣
Owner: 哈尔滨工业大学高新技术开发总公司 (Harbin Institute of Technology High-Tech Development General Corporation)