
Fringe projection time phase unwrapping method based on deep learning

A temporal phase unwrapping and fringe projection technology, applied in the field of optical measurement, achieving the effect of fewer error points and higher accuracy

Active Publication Date: 2019-01-22
NANJING UNIV OF SCI & TECH
Cites: 7 | Cited by: 57

AI Technical Summary

Problems solved by technology

Therefore, for 3D imaging based on fringe projection profilometry, there is still a lack of a method that offers both high measurement accuracy and high measurement efficiency.



Examples


Embodiment

[0039] To verify the effectiveness of the method described in the present invention, a camera (model acA640-750um, Basler), a DLP projector (model LightCrafter 4500PRO, TI) and a computer were used to construct a three-dimensional measurement setup for the deep learning-based fringe projection temporal phase unwrapping method, as shown in figure 2. When performing three-dimensional measurement of objects, this setup captures images at 25 frames per second. As described in step 1, four sets of three-step phase-shifted grating images with different frequencies are projected and collected; the frequencies of the four sets of grating patterns are 1, 8, 32 and 64, respectively. Using step 2, four groups of wrapped phase maps with different frequencies are obtained. Using step 3, the fringe order map and the absolute phase map of the phase with a frequency of 64 are obtained. Step 4 is used to build the residual convolutional neural network shown in figure 3...
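The patent specifies only the network's inputs (the wrapped phase maps at frequencies 1 and 64) and its output (the fringe order map of the frequency-64 phase), and calls it a residual convolutional neural network. The following is a minimal sketch of such a network, assuming PyTorch; the layer count, channel width and 640 x 480 image size are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of a residual CNN for fringe order prediction (PyTorch assumed).
# Input: wrapped phase maps at frequencies 1 and 64; output: fringe order map (f = 64).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Identity skip connection, the defining feature of a residual block.
        return self.relu(x + self.body(x))

class FringeOrderNet(nn.Module):
    """Maps (wrapped phase f=1, wrapped phase f=64) to the fringe order map of the f=64 phase."""
    def __init__(self, channels: int = 64, num_blocks: int = 8):  # width/depth are assumptions
        super().__init__()
        self.head = nn.Conv2d(2, channels, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(*[ResidualBlock(channels) for _ in range(num_blocks)])
        self.tail = nn.Conv2d(channels, 1, kernel_size=3, padding=1)

    def forward(self, phase_f1, phase_f64):
        x = torch.cat([phase_f1, phase_f64], dim=1)  # N x 2 x H x W
        return self.tail(self.blocks(self.head(x)))

# Example forward pass on a 640 x 480 wrapped-phase pair (resolution is an assumption).
net = FringeOrderNet()
phi_f1 = torch.rand(1, 1, 480, 640)
phi_f64 = torch.rand(1, 1, 480, 640)
fringe_order = torch.round(net(phi_f1, phi_f64))  # round to integer orders at inference
```

At inference time the continuous network output would be rounded to the nearest integer fringe order and combined with the frequency-64 wrapped phase to form the absolute phase, playing the same role as the conventional multi-frequency unwrapping of step 3.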



Abstract

The invention discloses a fringe projection temporal phase unwrapping method based on deep learning. First, four sets of three-step phase-shift grating patterns with frequencies of 1, 8, 32 and 64 are projected onto the object to be measured; a camera collects the grating images, and the wrapped phase maps are obtained by the three-step phase-shift method. Then, a multi-frequency temporal phase unwrapping algorithm is used to unwrap the wrapped phase maps and obtain the fringe order map of the phase with a frequency of 64. A residual convolutional neural network is constructed: the input data are the wrapped phase maps with frequencies of 1 and 64, and the output data is the fringe order map of the phase with a frequency of 64. Finally, a training set and a validation set are made to train and validate the network, and on a test set the network outputs the fringe order map of the frequency-64 phase. According to the fringe projection temporal phase unwrapping method based on deep learning of the invention, a deep learning approach is adopted and the wrapped phase map with a frequency of 1 is used to unwrap the wrapped phase map with a frequency of 64, so that an absolute phase map with fewer error points and higher accuracy can be obtained.
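For orientation, the classical steps the abstract refers to, three-step phase shifting and multi-frequency temporal phase unwrapping, can be sketched as follows. This is a minimal NumPy sketch; the function names and the 1 → 8 → 32 → 64 unwrapping chain are assumptions consistent with the abstract, not code from the patent.

```python
# Minimal NumPy sketch of the conventional pipeline the abstract summarizes:
# three-step phase shifting -> wrapped phase; multi-frequency temporal phase
# unwrapping -> fringe order and absolute phase of the high-frequency phase.
import numpy as np

def three_step_wrapped_phase(I1, I2, I3):
    """Wrapped phase in (-pi, pi] from three fringe images with shifts -2*pi/3, 0, +2*pi/3."""
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

def temporal_unwrap(Phi_low, phi_high, f_low, f_high):
    """Unwrap the wrapped phase phi_high (frequency f_high) against the absolute
    phase Phi_low (frequency f_low). Returns (fringe order map, absolute phase)."""
    k = np.round(((f_high / f_low) * Phi_low - phi_high) / (2.0 * np.pi))
    return k, phi_high + 2.0 * np.pi * k

# Hypothetical usage, chaining 1 -> 8 -> 32 -> 64 as in the abstract. The wrapped
# phase of the unit-frequency pattern is taken as already absolute.
# phi = {f: three_step_wrapped_phase(*images[f]) for f in (1, 8, 32, 64)}
# Phi = phi[1]
# for f_low, f_high in ((1, 8), (8, 32), (32, 64)):
#     k, Phi = temporal_unwrap(Phi, phi[f_high], f_low, f_high)
# After the last step, k is the frequency-64 fringe order map that the network
# sketched above is trained to predict directly from phi[1] and phi[64].
```

In the invention, this step-by-step chain is what produces the ground-truth fringe order maps for the training and validation sets, while at test time the network predicts the frequency-64 fringe order map directly from the frequency-1 and frequency-64 wrapped phases.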

Description

Technical Field

[0001] The invention belongs to the technical field of optical measurement, and in particular relates to a deep learning-based fringe projection temporal phase unwrapping method.

Background Technique

[0002] In recent decades, rapid 3D shape measurement technology has been widely used in various fields, such as intelligent monitoring, industrial quality control and 3D face recognition. Among the many three-dimensional shape measurement methods, fringe projection profilometry, based on the principles of structured light and triangulation, is one of the most practical technologies. Three-dimensional measurement generally involves three processes: phase recovery, phase unwrapping, and phase-to-height mapping.

[0003] Among phase recovery techniques, the two most commonly used methods are Fourier profilometry and phase-shifting profilometry. Fourier profilometry needs only a single fringe pattern to extract the phase, but this method is affected by...
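As background for the phase-recovery step mentioned in paragraph [0003], the standard N-step phase-shifting relations are reproduced below in textbook form (not quoted from the patent); with N = 3 this gives the three-step case used in this invention.

```latex
% Standard N-step phase-shifting profilometry (textbook form).
% I_n: captured fringe images, A: background intensity, B: modulation,
% \phi: wrapped phase recovered per pixel (x, y).
\begin{align}
  I_n(x,y) &= A(x,y) + B(x,y)\cos\!\left(\phi(x,y) - \frac{2\pi n}{N}\right),
             \qquad n = 0,\dots,N-1, \\
  \phi(x,y) &= \operatorname{atan2}\!\left(\sum_{n=0}^{N-1} I_n(x,y)\,\sin\frac{2\pi n}{N},\;
                                           \sum_{n=0}^{N-1} I_n(x,y)\,\cos\frac{2\pi n}{N}\right).
\end{align}
```

The wrapped phase obtained this way lies in (-π, π], which is why the subsequent phase unwrapping step, the subject of this invention, is needed to recover the absolute phase.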


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01B11/25
CPC: G01B11/25; G01B11/2527; G06N3/08; G06N3/048; G06N3/045; G06N3/049
Inventors: 左超 (Zuo Chao), 尹维 (Yin Wei), 陈钱 (Chen Qian), 冯世杰 (Feng Shijie), 孙佳嵩 (Sun Jiasong), 陶天阳 (Tao Tianyang), 胡岩 (Hu Yan), 张良 (Zhang Liang)
Owner: NANJING UNIV OF SCI & TECH