Indoor scene recognition method combining deep learning and sparse representation

An indoor scene recognition method using sparse representation technology, applied in the fields of image processing and indoor scene recognition. It addresses the problems of small differences between categories, occlusion, poor recognition performance, and high complexity, achieving high practicality and an improved recognition rate and accuracy.

Active Publication Date: 2017-05-10
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is to overcome the deficiencies of the prior art and provide an indoor scene recognition method that combines deep learning and sparse representation. Because indoor scenes involve small differences between categories, occlusion, and changes in scale and angle, indoor scene recognition is more complex and difficult than outdoor scene recognition, and current recognition performance is poor; the invention accordingly aims to improve the recognition rate and robustness of indoor scene recognition algorithms.


Examples

Embodiment Construction

[0025] Embodiments of the present invention will be described below in conjunction with the accompanying drawings.

[0026] As shown in Figure 1, the present invention designs an indoor scene recognition method combining deep learning and sparse representation, which comprises three major steps: bottom-level feature extraction, middle-level feature construction, and classifier design. Specifically, it includes the following steps:

[0027] Step A. Randomly select a number of indoor scene images from the indoor scene library as training samples, and use the remaining images in the library as test samples.
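
A minimal sketch of Step A follows. The (image_path, class_label) sample format, the function name, and the per-class training count are illustrative assumptions; a commonly used MIT-67 protocol takes about 80 training and 20 test images per class, but this excerpt does not fix a ratio.

    # Randomly split a scene library into training and test samples.
    # The 80-per-class training count is an assumed, illustrative value.
    import random

    def split_scene_library(samples, train_per_class=80, seed=0):
        # samples: list of (image_path, class_label) pairs
        rng = random.Random(seed)
        by_class = {}
        for path, label in samples:
            by_class.setdefault(label, []).append(path)
        train, test = [], []
        for label, paths in by_class.items():
            rng.shuffle(paths)
            train += [(p, label) for p in paths[:train_per_class]]
            test += [(p, label) for p in paths[train_per_class:]]
        return train, test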

[0028] Because the present invention is applied to indoor scene images, a publicly available indoor scene library should be selected to verify the validity of the algorithm. The typical MIT-67 indoor scene library is chosen in this example. The images in this library are not uniform in size, so it is pr...
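
The excerpt is cut off before it states the exact preprocessing, so the following is only a plausible sketch of the implied size normalization, assuming Pillow is used; the 256x256 target and bilinear resampling are placeholder choices, not values from the patent.

    # Hypothetical size normalization for MIT-67 images of non-uniform size.
    from PIL import Image

    def normalize_size(in_path, out_path, size=(256, 256)):
        img = Image.open(in_path).convert("RGB")
        img.resize(size, Image.BILINEAR).save(out_path)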


Abstract

The invention discloses an indoor scene recognition method combining deep learning and sparse representation, which comprises the steps of: randomly selecting a number of indoor scene images from an indoor scene library to serve as training samples, with the remaining indoor scene images serving as test samples; performing object category discrimination and detection on the training and test samples with the Fast-RCNN algorithm to build low-level features for each indoor scene image; combining the low-level features and spatial features of each indoor scene image with a bag-of-words model to build middle-level features; mixing the middle-level features of the training samples to build a sparse dictionary; sparsely representing each test sample over the sparse dictionary, computing a residual from the solved sparse coefficients and the input test sample, and judging from the residual the category to which the test sample belongs; and outputting the judged category. The method can accurately recognize indoor scenes, effectively improves the accuracy and robustness of indoor scene recognition, and has high practical value.
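
The final stage described here follows the classical sparse-representation classification scheme: stack the training mid-level features as dictionary columns, sparsely code the test feature over that dictionary, and keep the class whose coefficients give the smallest reconstruction residual. The sketch below illustrates that scheme; the choice of orthogonal matching pursuit as the sparse solver and all names are assumptions, since the abstract does not specify the solver.

    # Sparse-representation classification over mid-level features.
    # D: (d, n) matrix whose columns are training mid-level features;
    # labels: NumPy array giving the class of each column of D;
    # y: (d,) mid-level feature of the test image.
    import numpy as np
    from sklearn.linear_model import orthogonal_mp

    def src_classify(D, labels, y, n_nonzero=30):
        # Solve y ~= D x under a sparsity constraint on x (assumed OMP solver).
        x = orthogonal_mp(D, y, n_nonzero_coefs=n_nonzero)
        # Per-class residual: reconstruct y from that class's coefficients only,
        # and return the class with the smallest residual ||y - D x_c||_2.
        best_class, best_res = None, np.inf
        for c in np.unique(labels):
            x_c = np.where(labels == c, x, 0.0)
            res = np.linalg.norm(y - D @ x_c)
            if res < best_res:
                best_class, best_res = c, res
        return best_class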

Description

Technical field

[0001] The invention relates to an indoor scene recognition method combining deep learning and sparse representation, and belongs to the technical field of image processing.

Background technique

[0002] With the development and popularization of information technology and intelligent robots, scene recognition has become an important research issue in the fields of computer vision and pattern recognition. Scene image classification is the automatic classification of image datasets according to a given set of semantic labels. Scene recognition models fall into three main categories: those based on low-level features, those based on mid-level features, and those based on visual vocabularies. Low-level-feature methods extract global or block-wise texture, color, and other features of a scene image to classify it, as in the research of Vailaya and Szummer et al.; however, this method of extracting ...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06N3/08
CPC: G06N3/08, G06F18/241, G06F18/214
Inventors: 孙宁, 朱小英, 刘佶鑫, 李晓飞
Owner: NANJING UNIV OF POSTS & TELECOMM