
Multi-modal data fusion method and system based on discriminant multi-modal deep confidence network

A deep belief network and data fusion technology, applied in the fields of electrical digital data processing, special data processing applications, and instruments; it addresses the problem of reduced data fusion quality.

Active Publication Date: 2014-06-04
INST OF AUTOMATION CHINESE ACAD OF SCI


Problems solved by technology

In fact, since each modality contains modality-specific characteristics, those characteristics have a negative impact on the data fusion process and greatly reduce the quality of the final fusion. Another line of work uses "deep" models; although these models can handle the complex characteristics of multimodal data well, most existing models are generative, which makes them unsuitable for discriminative tasks such as classification and retrieval.



Detailed Description of the Embodiments

[0017] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0018] According to one aspect of the present invention, a multimodal data fusion method based on a discriminative multimodal deep belief network is proposed, which can be widely applied to classification and retrieval problems of multimodal data.

[0019] Figure 1 shows the flow chart of the multimodal data fusion method based on the discriminative multimodal deep belief network proposed by the present invention. As shown in Figure 1, the method includes the following steps:

[0020] Step 1. Establish a discriminative multimodal deep belief network, and set the number of layers and nodes of the network;

[0021] Wherein, the discriminative multimodal deep belief network is a multi-layer network structure, including a d...
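The multi-layer structure described in Step 1 can be sketched as one stack of layers per modality joined by a shared top layer. The following is a minimal illustration only; the class name, modality names, layer sizes, and the choice of concatenation before the joint layer are all assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sketch: one stack of weight matrices per modality,
# merged by a shared joint layer on top.
class MultimodalDBN:
    def __init__(self, modality_layer_sizes, joint_size, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per adjacent layer pair, per modality.
        self.weights = {
            name: [rng.normal(0, 0.01, size=(sizes[i], sizes[i + 1]))
                   for i in range(len(sizes) - 1)]
            for name, sizes in modality_layer_sizes.items()
        }
        # The joint layer receives the top layer of every modality.
        top_total = sum(sizes[-1] for sizes in modality_layer_sizes.values())
        self.joint_weight = rng.normal(0, 0.01, size=(top_total, joint_size))

    def fuse(self, inputs):
        """Propagate each modality upward, then merge in the joint layer."""
        tops = []
        for name, x in inputs.items():
            for w in self.weights[name]:
                x = sigmoid(x @ w)
            tops.append(x)
        return sigmoid(np.concatenate(tops, axis=1) @ self.joint_weight)

net = MultimodalDBN({"image": [784, 512, 128], "text": [2000, 256, 128]},
                    joint_size=64)
fused = net.fuse({"image": np.zeros((4, 784)), "text": np.zeros((4, 2000))})
print(fused.shape)  # (4, 64)
```

Here the number of layers and nodes is fixed at construction time, matching the step of "setting the number of layers and nodes of the network" before any training takes place.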



Abstract

The invention discloses a multi-modal data fusion method based on a discriminative multi-modal deep belief network. The method comprises the following steps: the discriminative multi-modal deep belief network is established; for the deep belief network corresponding to each modality of data, the optimized network weights are obtained by means of restricted Boltzmann machines; the objective functions of the multi-modal Boltzmann machines are minimized by means of an alternating optimization strategy, the optimized Boltzmann machine weights are obtained, and the final discriminative multi-modal deep belief network model is obtained; the multi-modal data to be fused are input into the deep belief network model to obtain the fusion result. The invention further discloses a multi-modal data fusion system based on the discriminative multi-modal deep belief network. According to the method and system, supervised label information is introduced into a traditional multi-modal deep belief network, and the relations between data of different modalities are mined in a discriminative manner, so that high accuracy can be guaranteed in large-scale multi-modal data classification and retrieval tasks.
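The layer-wise pretraining step the abstract refers to is conventionally done with restricted Boltzmann machines trained by contrastive divergence. The sketch below shows one CD-1 update for a binary RBM; all sizes, the learning rate, and the iteration count are illustrative assumptions, and this is the generic RBM procedure rather than the patent's specific alternating optimization.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_vis, b_hid, rng, lr=0.1):
    """One step of contrastive divergence (CD-1) for a binary RBM."""
    # Positive phase: hidden probabilities and a sample driven by the data.
    p_h0 = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one reconstruction step back down and up.
    p_v1 = sigmoid(h0 @ W.T + b_vis)
    p_h1 = sigmoid(p_v1 @ W + b_hid)
    n = v0.shape[0]
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / n
    b_vis += lr * (v0 - p_v1).mean(axis=0)
    b_hid += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_vis, b_hid

rng = np.random.default_rng(0)
v = (rng.random((16, 6)) < 0.5).astype(float)   # toy binary data
W = rng.normal(0, 0.01, size=(6, 4))            # 6 visible, 4 hidden units
b_v, b_h = np.zeros(6), np.zeros(4)
for _ in range(50):
    W, b_v, b_h = cd1_update(v, W, b_v, b_h, rng)
print(W.shape)  # (6, 4)
```

Stacking such RBMs, with the hidden activations of one layer serving as the visible data of the next, yields the per-modality deep belief networks whose weights are then refined jointly.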

Description

Technical Field

[0001] The invention relates to the fields of pattern recognition and machine learning, and in particular to a multimodal data fusion method and system based on a discriminative multimodal deep belief network.

Background Technique

[0002] A concept or content can usually be represented by data of multiple modalities. For example, an image and its corresponding text annotation are two modalities representing the same content; when people speak, lip movements and the corresponding sounds are likewise two corresponding modalities. Extensive work has confirmed that multiple modalities can provide a more comprehensive description of a concept than a single modality, which can help common pattern recognition problems such as classification and retrieval. The general practice of multimodal data fusion is to fuse the multimodal data into a common representation, and this common representation can then be used for subsequent classification or retr...
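The "common representation" idea in the background can be illustrated as projecting each modality into a shared space and combining the projections. In this minimal sketch the projection matrices are random stand-ins for what a real system would learn, and the averaging rule and all dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
image_feat = rng.random((3, 128))   # e.g. three image descriptors
text_feat = rng.random((3, 300))    # e.g. their text-annotation vectors

# Random stand-ins for learned modality-specific projections.
P_img = rng.normal(0, 0.1, size=(128, 64))
P_txt = rng.normal(0, 0.1, size=(300, 64))

# Common representation: average of the two projections in the shared space.
shared = 0.5 * (image_feat @ P_img + text_feat @ P_txt)
print(shared.shape)  # (3, 64)
```

Once both modalities live in the same 64-dimensional space, a single classifier or nearest-neighbour search can operate on the fused vectors, which is the downstream use the background paragraph describes.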

Claims


Application Information

IPC(8): G06F17/30
CPC: G06F16/2458
Inventor: 王亮, 谭铁牛, 王威, 黄岩
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI