Moving object segmentation using depth images

A depth-image technology for segmenting moving objects, applied in image analysis, image enhancement, image data processing, etc., addressing problems such as computationally complex algorithms.

Active Publication Date: 2012-09-12
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

However, these algorithms are computationally complex and thus require significant computational resources.




Detailed Description of Embodiments

[0018] The detailed description provided below in connection with the accompanying drawings is intended as a description of examples of the invention and is not intended to represent the only forms in which those examples may be constructed or used. The description sets forth the functions of an example of the invention and the sequence of steps for building and operating it. However, the same or equivalent functions and sequences may be accomplished by different examples.

[0019] Although examples of the present invention are described and shown herein as being implemented in a computer gaming system, the described system is provided by way of example only, and not limitation. Those skilled in the art will appreciate that the present example is suitable for application in various different types of computing systems that use 3D models.

[0020] Figure 1 is a schematic diagram of a person 100 standing in a room and holding a mobile de...



Abstract

Moving object segmentation using depth images is described. In an example, a moving object is segmented from the background of a depth image of a scene received from a mobile depth camera. A previous depth image of the scene is retrieved, and compared to the current depth image using an iterative closest point algorithm. The iterative closest point algorithm includes a determination of a set of points that correspond between the current depth image and the previous depth image. During the determination of the set of points, one or more outlying points are detected that do not correspond between the two depth images, and the image elements at these outlying points are labeled as belonging to the moving object. In examples, the iterative closest point algorithm is executed as part of an algorithm for tracking the mobile depth camera, and hence the segmentation does not add substantial additional computational complexity.
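The abstract contains no implementation, but the idea it describes (flagging points that fail the correspondence test inside the iterative closest point algorithm and labelling them as the moving object) can be illustrated with a generic closest-point search. The sketch below is an assumption-laden illustration, not the patented method: the function name segment_moving_points, the 5 cm distance threshold, and the use of scipy's cKDTree are invented for the example, and the actual algorithm reuses the data association already performed while tracking the mobile depth camera.

import numpy as np
from scipy.spatial import cKDTree

def segment_moving_points(prev_points, curr_points, pose, dist_thresh=0.05):
    # prev_points: (N, 3) camera-space points from the previous depth image
    # curr_points: (M, 3) camera-space points from the current depth image
    # pose: 4x4 rigid transform taking current-frame points into the previous frame
    # dist_thresh: illustrative distance (metres) beyond which a correspondence is rejected
    R, t = pose[:3, :3], pose[:3, 3]
    transformed = curr_points @ R.T + t                    # express current points in the previous frame
    dist, idx = cKDTree(prev_points).query(transformed)    # closest-point correspondences
    moving_mask = dist > dist_thresh                       # outlying points -> labelled as the moving object
    inliers_curr = transformed[~moving_mask]               # corresponding inlier pairs would drive
    inliers_prev = prev_points[idx[~moving_mask]]          # the next pose update of the ICP loop
    return moving_mask, (inliers_curr, inliers_prev)

# Example: two synthetic clouds that differ only in a displaced patch
rng = np.random.default_rng(0)
prev_points = rng.uniform(0.0, 1.0, size=(500, 3))
curr_points = prev_points.copy()
curr_points[:50, 0] += 0.3                                 # these 50 points "moved"
mask, _ = segment_moving_points(prev_points, curr_points, np.eye(4))
print(int(mask.sum()))                                     # roughly 50: only displaced points are flagged

Because the correspondence test is computed anyway while estimating the camera pose, labelling its outliers costs little extra, which is the point made at the end of the abstract.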

Description

Technical Field

[0001] The present invention relates to object segmentation, and more particularly to moving object segmentation using depth images.

Background

[0002] Three-dimensional computer models of real-world environments are useful in a variety of applications. For example, such models can be used in immersive gaming, augmented reality, architecture / planning, robotics, and engineering prototyping. Depth cameras (also known as z-cameras) can generate real-time depth maps of real-world environments. Each pixel in these depth maps corresponds to a discrete distance measurement captured by the camera from a 3D point in the environment. This means that these cameras provide a depth map consisting of an unordered set of points (called a point cloud) at real-time rates.

[0003] In addition to creating a depth map representation of the real-world environment, it is useful to be able to perform segmentation operations that distinguish indiv...
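Paragraph [0002] states that every depth-map pixel is a discrete distance measurement and that the back-projected pixels form an unordered point cloud. Purely as an illustration (the patent text here gives no code), the sketch below converts a depth image into camera-space 3D points under a standard pinhole model; the function name depth_to_point_cloud and the intrinsics fx = fy = 525, cx = 319.5, cy = 239.5 are assumed values chosen for the example, not taken from the patent.

import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    # depth: (H, W) array of distances in metres; 0 marks pixels with no reading
    # fx, fy, cx, cy: pinhole intrinsics of the depth camera
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel column / row indices
    valid = depth > 0
    d = depth[valid]
    x = (u[valid] - cx) * d / fx                     # back-project each valid pixel
    y = (v[valid] - cy) * d / fy
    return np.stack([x, y, d], axis=1)               # (N, 3) unordered point cloud

# Example: a flat synthetic 480x640 depth map, everything 2 m from the camera
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)                                   # (307200, 3)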


Application Information

IPC(8): G06T7/00
CPC: G06T7/215; G06T7/194; G06T2207/10028; G06T2207/30244
Inventors: R. Newcombe, S. Izadi, O. Hilliges, D. Kim, D. Molyneaux, J. D. J. Shotton, P. Kohli, A. Fitzgibbon, S. E. Hodges, D. A. Butler
Owner: MICROSOFT TECH LICENSING LLC