Depth feature extraction method for three-dimensional model

A depth feature extraction technology for three-dimensional models, applied in 3D object recognition, character and pattern recognition, instruments, etc.; it addresses problems such as large memory storage requirements, long network training time, and the inability of existing features to fully express 3D model information

Inactive Publication Date: 2017-08-25
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

[0004] Current feature extraction methods based on deep learning have the following problems: for example, the features extracted by the deep learning framework cannot fully express the 3D mod...

Method used



Examples


Embodiment

[0064] As shown in Figure 1 to Figure 4, the depth feature extraction method for a three-dimensional model of the present invention proceeds as follows:

[0065] First, extract the polar view of the 3D model as the training input data for the deep convolutional neural network;
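The excerpt does not spell out how the polar view itself is built. Below is a minimal sketch of one plausible construction, assuming the polar view is a 2D depth-style image indexed by spherical angles (theta, phi) with the largest surface radius stored at each pixel; the function name polar_view, the resolution, and the normalization are illustrative assumptions, not the patent's specification.

```python
# Illustrative sketch only: assumes a polar view is a (theta, phi) depth image.
import numpy as np

def polar_view(vertices: np.ndarray, height: int = 64, width: int = 64) -> np.ndarray:
    """Project 3D vertices of shape (N, 3) to a (height, width) polar-view image."""
    # Center the model at its centroid so angles are measured from the model center.
    v = vertices - vertices.mean(axis=0)
    r = np.linalg.norm(v, axis=1) + 1e-12
    theta = np.arccos(np.clip(v[:, 2] / r, -1.0, 1.0))   # polar angle in [0, pi]
    phi = np.arctan2(v[:, 1], v[:, 0])                    # azimuth in [-pi, pi]

    # Bin each vertex into an angular grid; keep the largest radius per cell.
    rows = np.clip((theta / np.pi * (height - 1)).astype(int), 0, height - 1)
    cols = np.clip(((phi + np.pi) / (2 * np.pi) * (width - 1)).astype(int), 0, width - 1)
    image = np.zeros((height, width), dtype=np.float32)
    np.maximum.at(image, (rows, cols), r.astype(np.float32))

    # Normalize to [0, 1] so views of differently scaled models are comparable.
    if image.max() > 0:
        image /= image.max()
    return image
```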

[0066] Secondly, construct a deep convolutional neural network and train it on the polar views. The deep convolutional neural network includes an input layer that takes the polar view as training input data; a convolutional layer used to learn the features of the polar view and produce two-dimensional feature maps; a pooling layer used to aggregate the two-dimensional feature maps at different positions and reduce the feature dimension; a fully connected layer used to arrange and link the two-dimensional feature maps into a one-dimensional vector; and an output layer that outputs the category prediction result;
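A minimal sketch of the layer structure described in [0066], written with PyTorch as an illustrative framework. The class name PolarViewCNN, the layer counts, kernel sizes, channel widths, and the 64x64 input size are assumptions; the patent excerpt does not specify them.

```python
# Illustrative sketch of the network structure in [0066]; sizes are assumptions.
import torch
import torch.nn as nn

class PolarViewCNN(nn.Module):
    def __init__(self, num_classes: int, feature_dim: int = 256):
        super().__init__()
        # Convolutional layers: learn features of the polar view as 2D feature maps.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # pooling: aggregate feature maps and reduce dimension
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Fully connected layer: arrange the 2D feature maps into a one-dimensional vector.
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, feature_dim), nn.ReLU(),  # assumes 64x64 input views
        )
        # Output layer: category prediction, used while training the network.
        self.classifier = nn.Linear(feature_dim, num_classes)

    def forward(self, x):
        features = self.fc(self.conv(x))   # this vector serves as the candidate depth feature
        return self.classifier(features), features
```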

[0067] Thirdly, input the polar view into the deep convolutional n...
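A minimal sketch of the remaining steps as summarized in the Abstract: train until the network converges, then use the fully connected layer's activation of the trained network as the depth feature. It continues the PolarViewCNN sketch above; the optimizer, loss function, and convergence criterion are assumptions, not details given in the excerpt.

```python
# Illustrative sketch of training-to-convergence and depth-feature extraction.
import torch

def train_until_convergence(model, loader, epochs: int = 100, tol: float = 1e-4):
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()
    prev_loss = float("inf")
    for _ in range(epochs):
        total = 0.0
        for views, labels in loader:         # views: (B, 1, 64, 64) polar-view batch
            opt.zero_grad()
            logits, _ = model(views)
            loss = loss_fn(logits, labels)
            loss.backward()                  # iteratively correct the internal weights
            opt.step()
            total += loss.item()
        if abs(prev_loss - total) < tol:     # residual no longer decreasing: treat as converged
            break
        prev_loss = total
    return model

@torch.no_grad()
def depth_feature(model, view):
    """Return the fully connected layer's feature vector for one polar view."""
    model.eval()
    _, feature = model(view.unsqueeze(0))    # add a batch dimension
    return feature.squeeze(0)
```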


PUM

No PUM

Abstract

The invention provides a depth feature extraction method for a three-dimensional model. The method includes the steps of: firstly, extracting a polar view of a three-dimensional model as training input data of a deep convolutional neural network; secondly, constructing the deep convolutional neural network and training it on the polar view; thirdly, inputting the polar view to the deep convolutional neural network for training until the deep convolutional neural network converges, and determining the internal weights of the trained deep convolutional neural network; and fourthly, inputting the polar view of the three-dimensional model whose feature is to be extracted to the trained deep convolutional neural network, and calculating the feature vector of the fully connected layer in the deep convolutional neural network as the depth feature of that three-dimensional model. According to the invention, the deep convolutional neural network is constructed, the weights are corrected through iterations, and the residual is reduced until the network converges. After training is completed, the fully connected layer of the convolutional neural network is extracted as the depth feature of the polar view of the three-dimensional model.

Description

technical field

[0001] The invention relates to the technical field of three-dimensional model processing, and more specifically to a depth feature extraction method for a three-dimensional model.

Background technique

[0002] With the rapid development of 3D model processing technology and of computer hardware and software, together with the spread of multimedia and Internet technology, large numbers of 3D models are used in many fields, and the demand for 3D model applications keeps increasing. 3D models play an important role in e-commerce, architectural design, industrial design, advertising, film and television, 3D games, and many other fields. 3D models in large-scale data sets need to support design reuse and model retrieval in all aspects of social production and life. Therefore, how to quickly and accurately retrieve a target 3D model from the existing 3D model data sets of various types has become an urgent problem to be solved. ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/46
CPC: G06V20/64, G06V10/44
Inventor: 周燕, 曾凡智
Owner: FOSHAN UNIVERSITY