
Multiscale deep reinforcement machine learning for n-dimensional segmentation in medical imaging

A technology involving machine learning and machine-learned models, applied in the field of multiscale deep reinforcement machine learning for N-dimensional segmentation in medical imaging

Active Publication Date: 2019-11-19
SIEMENS HEALTHCARE GMBH

AI Technical Summary

Problems solved by technology

These methods can suffer from several limitations, such as suboptimal local convergence and limited scalability. For high-resolution volumetric data, the use of scan patterns for boundary fitting leads to significant computational challenges.



Examples


Embodiment Construction

[0014] Multiscale deep reinforcement learning generates multiscale deep reinforcement models for N-dimensional (e.g. 3D) segmentation of objects, where N is an integer greater than 1. In this context, segmentation is formulated as learning an image-driven policy for shape evolution that converges to object boundaries. This segmentation is treated as a reinforcement learning problem, and scale-space theory is used to achieve robust and efficient multi-scale shape estimation. The learning challenge of end-to-end regression systems can be addressed by learning an iterative strategy to find segmentations.
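
As a hedged illustration of the iterative, image-driven shape-evolution policy described in paragraph [0014], the following Python sketch shows what inference with an already-trained multiscale policy could look like. The network architecture, the three-action set (move out, move in, stop), the patch sampling, and every name (VertexPolicy, evolve, segment_multiscale) are illustrative assumptions, not details taken from the patent.

```python
# Minimal sketch, not the patent's implementation: inference with an
# already-trained, image-driven shape-evolution policy, applied coarse to fine.
import numpy as np
import torch
import torch.nn as nn


class VertexPolicy(nn.Module):
    """Maps an image patch around one shape vertex to Q-values for the
    actions: 0 = move outward, 1 = move inward, 2 = stop."""

    def __init__(self, patch: int = 15, n_actions: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(patch * patch, 64),
            nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def sample_patch(image: np.ndarray, center: np.ndarray, patch: int = 15) -> np.ndarray:
    """Crop a (patch x patch) window centred on a vertex, zero-padded at borders."""
    half = patch // 2
    padded = np.pad(image, half, mode="constant")
    r, c = np.clip(np.round(center).astype(int), 0, np.array(image.shape) - 1) + half
    return padded[r - half:r + half + 1, c - half:c + half + 1]


def evolve(image, vertices, normals, policy, steps=10, step_size=1.0):
    """One scale level: greedily apply the policy's action to every vertex."""
    with torch.no_grad():
        for _ in range(steps):
            for i, (v, n) in enumerate(zip(vertices, normals)):
                patch = torch.from_numpy(sample_patch(image, v)).float()[None]
                action = int(policy(patch).argmax())
                if action == 0:
                    vertices[i] = v + step_size * n   # push outward along the normal
                elif action == 1:
                    vertices[i] = v - step_size * n   # pull inward along the normal
    return vertices


def segment_multiscale(image, vertices, normals, policies, scales=(4.0, 2.0, 1.0)):
    """Coarse-to-fine shape estimation: large displacements first, then
    progressively finer refinement with the policy trained for each scale."""
    for scale, policy in zip(scales, policies):
        vertices = evolve(image, vertices, normals, policy, step_size=scale)
    return vertices
```

The coarse-to-fine loop mirrors the scale-space idea: large steps settle the rough shape quickly, while the finest-scale policy only makes small, local corrections.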

[0015] Although trained as a full segmentation method, the trained policy can instead, or additionally, be used for shape refinement as a post-processing step. Any segmentation method provides an initial segmentation. Using that original segmentation as the initialization of the multiscale deep reinforcement machine-learned model, the machine-learned policy is applied to ...
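
For the refinement mode described in paragraph [0015], a short usage sketch might look as follows. It reuses the hypothetical VertexPolicy and evolve helpers from the sketch above, assumes scikit-image for contour extraction, and uses a box-shaped dummy mask as a stand-in for the output of any other segmentation method; a real application would load a trained fine-scale policy rather than a freshly initialized one.

```python
# Hedged usage sketch for the refinement mode: an initial mask from any other
# segmentation method is converted to a contour, then only a fine-scale policy
# (VertexPolicy / evolve from the sketch above, both hypothetical) is applied
# as post-processing.
import numpy as np
from skimage import measure  # assumption: contour extraction via scikit-image


def contour_and_normals(mask: np.ndarray):
    """Extract the longest iso-contour of a binary mask and estimate unit
    normals by rotating the finite-difference tangent by 90 degrees."""
    contour = max(measure.find_contours(mask.astype(float), 0.5), key=len)
    tangents = np.gradient(contour, axis=0)
    normals = np.stack([tangents[:, 1], -tangents[:, 0]], axis=1)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True) + 1e-8
    return contour, normals


# Dummy stand-ins for a real scan slice and an initial segmentation.
rng = np.random.default_rng(0)
image = rng.random((64, 64)).astype(np.float32)
initial_mask = np.zeros((64, 64), dtype=np.uint8)
initial_mask[20:44, 20:44] = 1          # e.g., a coarse result from another method

vertices, normals = contour_and_normals(initial_mask)
policy_fine = VertexPolicy()            # in practice: a trained, finest-scale policy
refined = evolve(image, vertices.copy(), normals, policy_fine, steps=5, step_size=0.5)
```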


Abstract

Multiscale deep reinforcement machine learning for N-dimensional segmentation in medical imaging. Multiscale deep reinforcement learning produces multiscale deep reinforcement models for multidimensional (e.g., 3D) segmentation of objects (22). In this context, segmentation is formulated as learning an image-driven policy (38) for shape evolution (40) that converges to object boundaries. This segmentation is treated as a reinforcement learning problem, and scale-space theory is used to achieve robust and efficient multi-scale shape estimation. The learning challenge of end-to-end regression systems can be addressed by learning an iterative strategy to find segmentations.

Description

[0001] Related Application

[0002] This patent document claims the benefit of the filing date under 35 U.S.C. § 119(e) of Provisional U.S. Patent Application Serial No. 62/500,604, filed May 3, 2017, which is hereby incorporated by reference.

Background

[0003] The present embodiments relate to segmentation in medical imaging. Accurate and fast segmentation of anatomical structures is a fundamental task in medical image analysis, enabling real-time guidance, quantification, and processing for diagnostic and interventional procedures. Previous solutions for 3D segmentation are based on machine learning-driven active shape models, front-propagation theory, Markov random field methods, or deep image-to-image regression models.

[0004] Active shape model and front-propagation solutions propose a parametric surface model that is deformed to fit the boundaries of the target object. Machine learning techniques utilize image databases to learn complex parametric...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/11, G06T15/00, G06N20/00
CPC: G06T7/11, G06T15/005, G06T2207/20081, G06T2207/20116, G06T2207/20124, G06T7/187, G06T2207/10072, G06T2207/20084, G06T7/149, G06N20/00, G16H40/60, G16H50/70, G06T7/12, G16H30/40, G06N7/01, G06N3/08, G06T7/13, G06T2207/10028, G06T2207/30004, G06N3/047
Inventor: D. Comaniciu, B. Georgescu, F.C. Ghesu
Owner: SIEMENS HEALTHCARE GMBH