
Quantitative photoacoustic imaging method on basis of deep neural networks

A deep-neural-network and photoacoustic-imaging technology, applied in the field of quantitative photoacoustic imaging based on deep neural networks. It addresses the problems of estimation errors and the excessive computation required to reconstruct large-scale high-resolution images, and achieves high accuracy, fast image reconstruction, and a network that is easy to optimize.

Active Publication Date: 2018-07-24
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0003] Traditional QPAI methods ignore the influence of wavelength on the light fluence and estimate SO2 by linear fitting (i.e., linear unmixing), which introduces serious errors.
Most fluence correction methods rely on strong assumptions, for example that the optical properties are piecewise constant, that the scattering parameters are known in advance, or that the background optical parameters are uniform (and known); otherwise they require too much computation when reconstructing large-scale high-resolution images.
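For context, linear unmixing estimates the chromophore concentrations at each pixel by a least-squares fit of known absorption spectra to the multi-wavelength PA amplitudes, implicitly assuming the fluence is identical at every wavelength. A minimal sketch of that baseline (NumPy; the spectra values and array shapes are illustrative assumptions, not taken from the patent):

```python
import numpy as np

# Illustrative molar absorption of HbO2 and Hb at three wavelengths (arbitrary units).
eps_hbo2 = np.array([0.27, 0.44, 0.58])   # HbO2 at 750, 800, 850 nm
eps_hb   = np.array([0.69, 0.44, 0.37])   # Hb   at 750, 800, 850 nm
E = np.stack([eps_hbo2, eps_hb], axis=1)  # shape (n_wavelengths, 2)

def linear_unmix_so2(p0):
    """Estimate SO2 per pixel from multi-wavelength initial pressure images.

    p0: array of shape (n_wavelengths, H, W).
    Assumes p0 is proportional to mu_a with a wavelength-independent fluence;
    that assumption is exactly what breaks down in deep tissue.
    """
    n_wl, h, w = p0.shape
    c, *_ = np.linalg.lstsq(E, p0.reshape(n_wl, -1), rcond=None)  # (2, H*W): [C_HbO2, C_Hb]
    so2 = c[0] / np.clip(c[0] + c[1], 1e-12, None)                # HbO2 / (HbO2 + Hb)
    return so2.reshape(h, w)
```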

Method used


Examples


Embodiment Construction

[0012] The present invention will be described in detail below in conjunction with the accompanying drawings. However, it should be understood that the accompanying drawings are provided only for better understanding of the present invention, and they should not be construed as limiting the present invention.

[0013] Deep learning (DL) has attracted attention in many fields, including medical imaging. By using a deep neural network (DNN) to represent a nonlinear mapping and adjusting the network weights with large amounts of training data, DL can automatically detect features in measurement data and use the mined features to predict target data or make decisions.

[0014] Convolutional neural networks (CNNs) have superior image-processing performance; their convolutional layers filter the input data and extract useful information. U-net is a fully convolutional neural network consisting of a contracting path (to capture contextual information) and a symmetric ...
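As an illustration only (the patent does not disclose implementation code), a residual U-net of this kind could be sketched as follows; the framework (PyTorch), module names, channel counts, and depth are assumptions:

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Two 3x3 convolutions with a shortcut connection (residual learning)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch),
        )
        # 1x1 projection so the identity shortcut matches the output channel count.
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.conv(x) + self.skip(x))

class TinyResUNet(nn.Module):
    """Minimal contracting/expanding network: multi-wavelength p0 in, one quantitative map out."""
    def __init__(self, n_wavelengths=3):
        super().__init__()
        self.enc1 = ResBlock(n_wavelengths, 32)
        self.enc2 = ResBlock(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = ResBlock(64, 32)      # 32 upsampled channels + 32 skip-connected channels
        self.out = nn.Conv2d(32, 1, 1)

    def forward(self, x):
        e1 = self.enc1(x)                 # contracting path, full resolution
        e2 = self.enc2(self.pool(e1))     # contracting path, half resolution
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # expanding path with skip connection
        return self.out(d1)
```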



Abstract

The invention relates to a quantitative photoacoustic imaging method based on deep neural networks. The method comprises: constructing a deep neural network framework, namely ResU-net; training the ResU-net with initial acoustic pressure images acquired at different wavelengths as input and the corresponding quantitative images as targets; and performing quantitative photoacoustic imaging by feeding multi-wavelength initial acoustic pressure images into the trained ResU-net, which outputs the quantitative images. The method has the following advantages: a residual learning mechanism is used in the quantitative photoacoustic imaging network (ResU-net), so the network is easy to optimize and can reach considerable depth, giving high accuracy; furthermore, the network is built with a contracting path and an expanding path, so the ResU-net can extract comprehensive contextual information at multiple resolution levels from the input multi-wavelength initial acoustic pressure images and ultimately output high-resolution quantitative images.
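A hedged sketch of the training and inference loop, assuming PyTorch, the hypothetical TinyResUNet module sketched earlier, a mean-squared-error loss, and SO2 maps as the quantitative target; none of these specifics (framework, loss, optimizer, image sizes) are stated in the patent:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in data: multi-wavelength initial pressure images and matching SO2 maps.
p0_images = torch.rand(16, 3, 64, 64)   # (N, n_wavelengths, H, W)
so2_maps  = torch.rand(16, 1, 64, 64)   # (N, 1, H, W)
loader = DataLoader(TensorDataset(p0_images, so2_maps), batch_size=4, shuffle=True)

model = TinyResUNet(n_wavelengths=3)     # hypothetical module sketched above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for epoch in range(5):                   # a few epochs for illustration only
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)      # predicted vs. ground-truth quantitative image
        loss.backward()
        optimizer.step()

# Inference: quantitative image from a new multi-wavelength p0 stack.
with torch.no_grad():
    so2_pred = model(torch.rand(1, 3, 64, 64))
```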

Description

Technical field

[0001] The invention relates to a photoacoustic imaging method, and in particular to a quantitative photoacoustic imaging method based on a deep neural network.

Background technique

[0002] Photoacoustic (PA) imaging can achieve good spatial resolution and good specificity. Quantitative photoacoustic imaging (quantitative PA imaging, QPAI) converts multispectral PA images into a series of accurate quantitative images, such as concentration images of specific molecular markers (used in molecular imaging and in assessing tumor growth, metabolism, and response or resistance to various treatments) and blood oxygen saturation (SO2) maps.

[0003] Traditional QPAI methods ignore the influence of wavelength on the light fluence and estimate SO2 by linear fitting (i.e., linear unmixing), which introduces serious errors. Most fluence correction methods rely on strong assumptions, such as that the optical properties are piecewise constant, that the scattering parameters ne...
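For reference, the standard PA generation model (textbook background, not quoted from the patent) shows why the wavelength dependence of the fluence matters: the measured initial pressure mixes the absorption spectrum with an unknown, wavelength-dependent fluence.

```latex
% Gamma = Grueneisen parameter, mu_a = absorption coefficient, Phi = light fluence.
p_0(\mathbf{r},\lambda) = \Gamma(\mathbf{r})\,\mu_a(\mathbf{r},\lambda)\,\Phi(\mathbf{r},\lambda;\mu_a,\mu_s'),
\qquad
\mu_a(\mathbf{r},\lambda) = \varepsilon_{\mathrm{HbO_2}}(\lambda)\,C_{\mathrm{HbO_2}}(\mathbf{r})
                          + \varepsilon_{\mathrm{Hb}}(\lambda)\,C_{\mathrm{Hb}}(\mathbf{r})
```

Because the fluence depends on wavelength and on the unknown optical properties, treating p_0(lambda) as proportional to mu_a(lambda) (linear unmixing) only holds near the tissue surface, which motivates learning the mapping from multi-wavelength p_0 images directly to quantitative images.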

Claims


Application Information

IPC(8): A61B5/00; G06N3/08; G06N3/04
CPC: A61B5/0095; G06N3/08; G06N3/045
Inventor: 罗建文 (Luo Jianwen), 蔡创坚 (Cai Chuangjian), 马骋 (Ma Cheng)
Owner: TSINGHUA UNIV