Detecting objects in video data

A technology for detecting objects in video data, applied in the field of video processing, which addresses the problem of frequent changes in the position and orientation of the capture device.

Active Publication Date: 2019-04-16
IMPERIAL INNOVATIONS LTD

AI Technical Summary

Problems solved by technology

For example, non-uniform terrain or a hand-held camera can cause frequent changes in camera position and orientation in areas in which the scene is repeatedly observed and re-observed.

Method used



Examples


Example 100

[0052] Example 100 also illustrates various exemplary capture devices 120 that may be used to capture video data associated with a 3D space. A capture device 120 may comprise a camera arranged to record data resulting from viewing the 3D space 110, in digital or analog form. In some cases, the capture device 120 is movable; for example, it may be arranged to capture different frames corresponding to different observed portions of the 3D space 110. The capture device 120 may be movable relative to a static mount; for example, it may include actuators to change the position and/or orientation of the camera relative to the 3D space 110. In another case, the capture device 120 may be a handheld device operated and moved by a human user.

[0053] In Figure 1A, a plurality of capture devices 120 are also shown coupled to a robotic device 130 arranged to move within the 3D space 110. The robotic device 135 may comprise an autonomous aerial and/or land mobile device. In this example 100, roboti...



Abstract

Certain examples described herein enable semantically-labelled representations of a three-dimensional (3D) space to be generated from video data. In described examples, a 3D representation is a surface element or 'surfel' representation, where the geometry of the space is modelled using a plurality of surfaces that are defined within a 3D co-ordinate system. Object-label probability values for spatial elements of frames of video data may be determined (605) using a two-dimensional image classifier. Surface elements that correspond to the spatial elements are identified (610) based on a projection of the surface element representation using an estimated pose for a frame. Object-label probability values for the surface elements are then updated (615) based on the object-label probability values for corresponding spatial elements. This results in a semantically-labelled 3D surface element representation of objects present in the video data. This data enables computer vision and/or robotic applications to make better use of the 3D representation.
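As a rough illustration of the update step the abstract describes, the sketch below fuses per-pixel object-label probabilities from a 2D classifier into per-surfel probabilities using a precomputed list of surfel-to-pixel correspondences (obtained by projecting the surfel representation into the frame with the estimated pose). This is a minimal sketch, not the patented implementation; all names (update_surfel_labels, correspondences and so on) are illustrative assumptions.

import numpy as np

def update_surfel_labels(surfel_probs, pixel_probs, correspondences):
    # surfel_probs:    (S, C) array of current object-label probabilities per surfel
    # pixel_probs:     (H, W, C) array of classifier probabilities for the current frame
    # correspondences: iterable of (surfel_index, (row, col)) pairs, one per surfel that
    #                  projects into the frame under the estimated camera pose
    for s, (r, c) in correspondences:
        fused = surfel_probs[s] * pixel_probs[r, c]   # combine per-class evidence
        surfel_probs[s] = fused / fused.sum()         # renormalise so classes sum to one
    return surfel_probs

Repeating this update over successive frames accumulates classifier evidence from many viewpoints into the semantically-labelled 3D surface element representation.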

Description

Technical field

[0001] The present invention relates to video processing. In particular, the invention relates to processing frames of video data and labeling surface elements within a representation of three-dimensional (3D) space. The invention relates particularly, but not exclusively, to generating semantically labeled representations of 3D spaces for use in robotics and/or augmented reality applications.

Background technique

[0002] In computer vision and robotics, it is often necessary to construct representations of 3D spaces. Building a representation of a 3D space allows a real-world environment to be mapped into a virtual or digital realm, where it can be used and manipulated by electronic devices. For example, a mobile robotic device may require a representation of 3D space to allow simultaneous localization and mapping, and thus navigation of its environment. Alternatively, a representation of the 3D space may enable identification and/or extraction of 3D mod...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00; G06F17/00; G06K9/62
CPC: G06V20/64; G06V20/41; G06V10/7715; G06F18/213; G06T7/00; G06V20/647; H04N13/183; G06N3/08; G06V10/28; G06F18/2148; G06F18/2415
Inventor: John Brendan McCormac, Ankur Handa, Andrew Davison, Stefan Leutenegger
Owner: IMPERIAL INNOVATIONS LTD