
Systems and methods for compression of three dimensional depth sensing

A depth-sensing and depth-map technology, applied in the field of compressed 3D depth sensing, which addresses the time and resource consumption of building 3D depth maps.

Pending Publication Date: 2019-10-11
ANALOG DEVICES INT UNLTD

AI Technical Summary

Problems solved by technology

Building 3D depth maps takes time and can consume significant resources, such as light power and processing resources.



Examples


Embodiment 1

[0046] Embodiment 1: Combining 2D Image Segmentation with 3D Depth Sensing

[0047] Aspects of the embodiments relate to systems and methods for utilizing two-dimensional (2D) image data to increase sensing efficiency when acquiring three-dimensional (3D) depth points. In implementations, segmentation can be used to identify regions of interest (ROIs) in a 2D image of the target scene, so that depth information is obtained only where needed, or so that 3D scanning of certain areas is reduced.

[0048] Figure 3A is a schematic illustration of an image 300 captured by a 2D imaging device according to an embodiment of the disclosure. In image 300, four "objects" are identified: wall 302, cylinder 304, cuboid 306, and cone 308. Image 300 may be captured by a conventional 2D imaging device, such as 2D imaging device 140a or 140b. Figure 3B is a schematic illustration of a segmented 2D image 350 according to an embodiment of the disclosure. After segmentation, wall 352 (not identified as a target region) is shown with...
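A minimal sketch of this flow, assuming OpenCV's Otsu thresholding for the segmentation step and a hypothetical `depth_scanner.scan(x, y, w, h)` interface for the steered depth capture (neither is specified by the disclosure):

```python
import numpy as np
import cv2  # OpenCV, used here only for simple contour-based segmentation

def find_rois(image_2d, min_area=500):
    """Segment an 8-bit grayscale 2D image into regions of interest (ROIs).

    Returns bounding boxes (x, y, w, h) for segments large enough
    to be worth a dedicated depth scan.
    """
    _, mask = cv2.threshold(image_2d, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]

def capture_compressed_depth(image_2d, depth_scanner):
    """Scan depth only inside 2D ROIs instead of the full field of view.

    `depth_scanner.scan(x, y, w, h)` is a hypothetical interface that
    steers the light source over one rectangular region and returns a
    partial depth map for it.
    """
    depth_map = np.full(image_2d.shape, np.nan)  # unscanned pixels stay NaN
    for (x, y, w, h) in find_rois(image_2d):
        depth_map[y:y + h, x:x + w] = depth_scanner.scan(x, y, w, h)
    return depth_map
```

The point of the sketch is the resource saving: the active illumination only visits the ROI rectangles, so light power and processing scale with the segmented foreground rather than the whole scene.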

Embodiment 2

[0072] Embodiment 2: Multi-Resolution Imaging for Compressed 3D Depth Sensing

[0073] Thoroughly scanning an entire scene with an active laser takes time and power. Scenes are usually sparse, meaning that most of the information is redundant, especially between neighboring pixels. In an embodiment, the scene may first be scanned with the depth sensor at a coarse spatial resolution. Depth information can be extracted, along with a "depth complexity" for each pixel. The region covered by each pixel can then be revisited at a finer resolution, depending on that pixel's depth complexity (indicating how much depth variation it contains) and the region's correlation (e.g., based on any one or a combination of prior knowledge of features of the scene or object). Additional factors include the results of a previous segmentation of 2D images of the scene, changes observed in 2D images relative to previous snapshots of the same scene, and/or the specific application (e...
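A minimal sketch of this coarse-to-fine strategy, assuming a hypothetical scanner exposing `scan_cell_grid(shape)` (one coarse pass) and `scan_cell(i, j, n)` (an n-by-n fine re-scan of one coarse cell); the depth-complexity measure here is just the local depth range, one plausible choice among those the disclosure allows:

```python
import numpy as np

def depth_complexity(coarse_depth):
    """Per-pixel depth complexity at the coarse resolution,
    taken here as the depth range over a 3x3 neighborhood."""
    h, w = coarse_depth.shape
    complexity = np.zeros_like(coarse_depth, dtype=float)
    for i in range(h):
        for j in range(w):
            patch = coarse_depth[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            complexity[i, j] = patch.max() - patch.min()
    return complexity

def coarse_to_fine_scan(depth_scanner, shape, threshold=0.1, refine=4):
    """Coarse scan everywhere, then re-scan only high-complexity pixels
    at `refine`-times finer resolution."""
    coarse = depth_scanner.scan_cell_grid(shape)       # one coarse pass
    fine = np.kron(coarse, np.ones((refine, refine)))  # upsampled baseline
    cplx = depth_complexity(coarse)
    for i, j in zip(*np.where(cplx > threshold)):      # revisit busy cells only
        fine[i * refine:(i + 1) * refine,
             j * refine:(j + 1) * refine] = depth_scanner.scan_cell(i, j, refine)
    return fine
```

Flat regions (low depth range) keep the cheap upsampled coarse value; only cells with real depth structure pay for a second, finer pass.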

Embodiment 3

[0082] Embodiment 3: Applying Super-Resolution to 3D Sensing

[0083] In 3D sensing, the pixel resolution may be too coarse because of the area illuminated by the laser beam. That area depends on the system (e.g., the collimator), but also on the scene (the same beam covers less area on closer objects than on distant ones). Since the illuminated regions are broad and poorly defined (often following a Gaussian illumination pattern), some overlap between adjacent regions is expected. Higher overlap can be exploited to obtain higher-resolution images with super-resolution techniques.

[0084] When a pixel is illuminated (and reflected light is then received from it), the light distribution of the received reflected signal is not completely uniform. The overlap between nearby pixels can allow higher-resolution depth information to be inferred.
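A minimal 1-D sketch of how overlapping Gaussian beam footprints can be inverted for finer depth, using regularized least squares; the footprint model and the `sigma` and `reg` values are illustrative assumptions, not parameters from the disclosure:

```python
import numpy as np

def gaussian_footprints(n_pixels, n_fine, sigma):
    """Rows = coarse pixels; columns = fine sample positions.
    A[i, k] is how strongly fine position k contributes to coarse
    measurement i under a Gaussian illumination profile."""
    centers = np.linspace(0, n_fine - 1, n_pixels)
    k = np.arange(n_fine)
    A = np.exp(-0.5 * ((k[None, :] - centers[:, None]) / sigma) ** 2)
    return A / A.sum(axis=1, keepdims=True)  # normalize each footprint

def super_resolve(measurements, n_fine, sigma=2.0, reg=1e-3):
    """Recover a finer depth profile from overlapping coarse pixels by
    regularized least squares: minimize ||A d - m||^2 + reg * ||d||^2."""
    A = gaussian_footprints(len(measurements), n_fine, sigma)
    lhs = A.T @ A + reg * np.eye(n_fine)
    return np.linalg.solve(lhs, A.T @ measurements)

# Toy example: three overlapping pixel readings, as with adjacent beams.
m = np.array([1.0, 1.4, 2.0])        # coarse depth measurements
d_fine = super_resolve(m, n_fine=9)  # 9-sample finer depth profile
```

The regularization term is needed because the system is underdetermined (more fine samples than coarse measurements); it selects a small-norm solution consistent with the overlapping readings.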

[0085] Figure 11 shows three depth signals representing adjacent pixel values according to an embodiment of the disclosure. Finer resolut...



Abstract

Aspects of the embodiments are directed to time-of-flight (ToF) imaging systems and methods for image processing. The ToF imaging system can include a depth sensor, a light steering device, a photodetector, and an image processor. The ToF imaging system can be configured to acquire a first image of a scene with the photodetector, identify one or more regions of interest of the scene from the first image, and capture a depth map of at least one of the one or more regions of interest.

Description

Technical Field

[0001] The present disclosure relates to systems and methods for compressed three-dimensional depth sensing.

Background

[0002] Depth-sensing imaging systems can use coherent light sources and light steering to illuminate a scene for depth estimation. Building three-dimensional depth maps takes time and can consume significant resources, such as illumination power and processing resources.

Summary

[0003] Aspects of the embodiments relate to time-of-flight (ToF) imaging systems and methods of operating them. For example, a method of operating a ToF imaging system may include: acquiring a first image of a scene; identifying one or more target regions of the scene from the first image; and capturing a depth map of at least one of the one or more target regions.

[0004] A time-of-flight imaging system may include: a depth sensor; a light steering device; a photodetector; and an image processor. The time-of-flight imaging system may be configured...
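A minimal skeleton of the claimed system and its method of operation; all four component interfaces (`capture`, `segment`, `point_to`, `measure`) are hypothetical stand-ins for real hardware drivers, not APIs from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ToFImagingSystem:
    """The claimed composition: a depth sensor, a light steering
    device, a photodetector, and an image processor."""
    depth_sensor: object
    light_steering: object
    photodetector: object
    image_processor: object

    def operate(self):
        # Step 1: acquire a first (2D) image of the scene.
        first_image = self.photodetector.capture()
        # Step 2: identify one or more target regions from that image.
        rois = self.image_processor.segment(first_image)
        # Step 3: capture a depth map of at least one target region,
        # steering the light source to each region in turn.
        depth_maps = []
        for roi in rois:
            self.light_steering.point_to(roi)
            depth_maps.append(self.depth_sensor.measure(roi))
        return depth_maps
```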

Claims


Application Information

IPC(8): G01S17/89; G01S17/86
CPC: G01S17/42; G01S7/4808; G01S7/4814; G01S7/4868; G01S7/4817; G01S17/89; G01S17/86; H04N13/271; H04N13/296; H04N2013/0092; H04N2213/003
Inventors: J. Calpe Maravilla; E. English; M. Zecchini; Chao Wang
Owner: ANALOG DEVICES INT UNLTD