
Divergence ratio distance mapping camera

A distance mapping technology based on the divergence ratio of light, applied in the field of three-dimensional information detection and mapping. It addresses the limitations of triangulation methods, which are too slow for real-time television-camera operation, and of time-of-flight methods, and achieves the effect of minimizing light source shadowing.

Inactive Publication Date: 2008-09-25
IIZUKA KEIGO
Cites: 5 · Cited by: 34

AI Technical Summary

Benefits of technology

[0010]In accordance with the method of the invention, the one or more objects are illuminated using another pair of additional light sources for reduction of the impact of shadow on the measurement of light intensity.
[0014]In yet another aspect of the present invention, a distance mapping system is provided wherein, to minimize the effect of light source shadowing, the system further comprises an additional pair of light sources.

Problems solved by technology

This triangulation method is limited in that it is too slow and generally cannot provide for the real-time operation of a television camera.
While depth resolution can be within micrometers, time-of-flight methods can take on the order of minutes to produce a depth map of a target object.

Method used


Examples


Embodiment Construction

[0029]FIG. 1a illustrates the distance mapping apparatus 1 capturing an image I1 of a target object 3 using a first illuminating device 5 as a light source. The first illuminating device 5 illuminates the target object 3 and the camera device 7 captures an image I1 that is stored by the system (see FIG. 6).

[0030]FIG. 1b illustrates the distance mapping apparatus 1 capturing an image I2 of a target object 3 using a second illuminating device 9 as a light source. The second illuminating device 9 illuminates the target object 3 and the camera device 7 captures an image I2 that is stored by the system (see FIG. 6).

[0031]FIG. 1c illustrates the amplitude ratio between I1 and I2. As further explained (see FIG. 2), through the derivation of the equation to calculate distance, the present invention functions by comparing the relative image intensities between I1 and I2 on a pixel-by-pixel basis. FIG. 1c shows a graph in which the relative image intensities between I1 and I2 have been ...
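Paragraph [0031] refers to a derived equation that converts the per-pixel intensity ratio into distance. One simple derivation consistent with inverse-square decay is sketched below. The geometry is an assumption for illustration only: two point sources of equal power P placed on the optical axis, separated by a known offset Δ, with the nearer source at the unknown distance d. The patent's actual configuration, which measures distance from the midpoint between the two sources, may differ.

```latex
% Inverse-square intensities from the two sources (illustrative geometry):
I_1 = \frac{P}{4\pi d^{2}}, \qquad I_2 = \frac{P}{4\pi (d+\Delta)^{2}}
% Per-pixel ratio of the two captured images:
r = \frac{I_1}{I_2} = \left(\frac{d+\Delta}{d}\right)^{2}
% Solving for the unknown distance:
d = \frac{\Delta}{\sqrt{r}-1}
```

Because the common factors (source power, surface reflectance, pixel gain) cancel in the ratio, the result depends only on geometry, which is what lets the method work pixel by pixel without calibrating absolute brightness.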


PUM

Properties listed: distance, light intensity, distances (measurement values and units not shown).

Abstract

The present invention relates to a method and system for detecting and mapping three-dimensional information pertaining to one or more target objects. More particularly, the invention consists of selecting one or more target objects; illuminating them using a first light source and capturing an image; illuminating the same objects using a second light source and capturing a second image; and calculating the distance between the midpoint of the two light sources and the objects, based on the decay of light intensity over distance, by analyzing the ratio of the image intensities on a pixel-by-pixel basis.
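The capture-and-ratio procedure described in the abstract can be sketched in a few lines of NumPy. This is a hedged illustration, not the patented implementation: the function name `distance_from_ratio`, the on-axis two-source geometry, and the `delta` separation parameter are all assumptions introduced here, and the sketch simplifies away the midpoint reference the abstract mentions.

```python
import numpy as np

def distance_from_ratio(img_near, img_far, delta, eps=1e-9):
    """Per-pixel distance map from the ratio of two intensity images.

    Illustrative sketch only, assuming two point sources on the optical
    axis separated by `delta`, with the nearer source at unknown distance d.
    Under the inverse-square law r = I_near/I_far = ((d + delta)/d)**2,
    so d = delta / (sqrt(r) - 1).

    img_near : image lit by the source closer to the object
    img_far  : image lit by the source farther from the object
    delta    : known separation between the sources (same unit as output)
    """
    # Ratio of the two exposures; guard against division by zero.
    r = img_near.astype(float) / np.maximum(img_far.astype(float), eps)
    # The ratio must exceed 1 for a finite positive distance.
    r = np.maximum(r, 1.0 + eps)
    return delta / (np.sqrt(r) - 1.0)

# Synthetic check: a flat target at d = 2.0 with sources 0.1 apart.
d_true, delta = 2.0, 0.1
img_near = np.full((2, 2), 1.0 / d_true**2)
img_far = np.full((2, 2), 1.0 / (d_true + delta)**2)
depth = distance_from_ratio(img_near, img_far, delta)
```

Because reflectance and sensor gain cancel in the ratio, the same computation applies unchanged to textured or unevenly reflective surfaces, which is the practical appeal of a ratio-based depth camera.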

Description

FIELD OF THE INVENTION

[0001]The present invention relates to a method and system for detecting and mapping three-dimensional information pertaining to an object. In particular, the invention relates to a method and system that makes use of the divergence of light over distance as a means of determining distance.

BACKGROUND OF THE INVENTION

[0002]Distance mapping or depth mapping cameras have become ubiquitous in numerous fields such as robotics, machine vision for acquiring three-dimensional (3D) information about objects, intelligent transport systems for assisting driver safety and navigation, bioscience for detecting 3D laparoscopic images of internal organs, non-contact fingerprinting, and image manipulation in movie or television studios.

[0003]To achieve the goal of distance mapping an object in order to acquire its 3D information, numerous methods have been developed. The triangulation method uses two or more images taken by strategically placed cameras to calculate the position...

Claims


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G01N21/00; G01S17/89
CPC: G01S17/08; G01S17/89; G06T7/0022; G06T2207/10016; H04N13/0253; G06T7/0073; H04N13/0207; G06T7/586; H04N13/207; H04N13/254
Inventor IIZUKA, KEIGO
Owner IIZUKA KEIGO