Multiscale weighted matching and sensor fusion for dynamic vision sensor tracking

A dynamic vision sensor technology, applied to instruments, character and pattern recognition, and image data processing, addressing problems such as the difficulty of cross-checking camera motion or pose, difficult image matching, and a lack of key features.

Active Publication Date: 2018-06-26
SAMSUNG ELECTRONICS CO LTD

AI Technical Summary

Problems solved by technology

The main difficulties related to DVS camera motion estimation or tracking include: (1) features within each DVS frame are sparse and highly variable, so feature-based image matching becomes difficult (if possible at all) and motion-estimation accuracy suffers; and (2) because key features cannot be reliably extracted, corresponding landmarks of the DVS motion are unavailable.
Therefore, it may be difficult to cross-check current estimates of camera motion or pose, and it may be difficult to reference landmarks to reduce drift in the sensor motion estimate.



Examples


Embodiment Construction

[0018] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be understood by those skilled in the art that the disclosed aspects may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to obscure the subject matter disclosed herein.

[0019] Throughout this specification, reference to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment disclosed herein. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" or "according to an embodiment" (or other phrases of similar meaning) in various places throughout this specification are not necessarily all referring to the same embodiment. Further...



Abstract

The invention relates to multiscale weighted matching and sensor fusion for dynamic vision sensor (DVS) tracking. A dynamic vision sensor pose-estimation system includes a DVS, a transformation estimator, an inertial measurement unit (IMU), and a camera-pose estimator based on sensor fusion. The DVS detects DVS events and forms frames from a number of accumulated DVS events. The transformation estimator estimates a 3D transformation of the DVS based on an estimated depth and matching confidence-level values within a camera-projection model, such that at least one of a plurality of DVS events detected during a first frame corresponds to a DVS event detected during a second, subsequent frame. The IMU detects inertial movements of the DVS with respect to world coordinates between the first and second frames. The camera-pose estimator combines information from a change in the pose of the camera-projection model between the first frame and the second frame, based on the estimated transformation and the detected inertial movements of the DVS.
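As a rough illustration of the flow described in this abstract, the sketch below performs a coarse-to-fine (multiscale) weighted match between two event frames and blends the result with an inertial estimate. It is a minimal 2D simplification in Python/NumPy: the names `estimate_shift_multiscale` and `fuse_motion`, the exhaustive ±2-pixel search, the overlap-based confidence value, and the linear blending weights are all assumptions for illustration, not the patent's 3D transformation estimator, depth model, or camera-projection model.

```python
# Minimal 2D sketch of multiscale weighted matching plus sensor fusion.
# All names, search windows, and weights are illustrative assumptions; the
# patent describes a full 3D transformation using estimated depth and a
# camera-projection model, which is not reproduced here.
import numpy as np


def estimate_shift_multiscale(frame_a, frame_b, levels=3):
    """Estimate the pixel shift that aligns frame_b with frame_a, coarse to fine.

    At each pyramid level the best integer shift within +/-2 pixels of the
    current estimate is found by exhaustive search; the overlap of event
    pixels serves as a crude confidence value.
    """
    shift = np.zeros(2)
    confidence = 0.0
    for level in reversed(range(levels)):          # coarsest level first
        step = 2 ** level
        a = (frame_a[::step, ::step] != 0).astype(float)
        b = (frame_b[::step, ::step] != 0).astype(float)
        base = np.round(shift / step).astype(int)  # carry the estimate down a level
        best, best_score = (0, 0), -1.0
        for dy in range(-2, 3):
            for dx in range(-2, 3):
                rolled = np.roll(b, (base[0] + dy, base[1] + dx), axis=(0, 1))
                score = float(np.sum(a * rolled))
                if score > best_score:
                    best, best_score = (dy, dx), score
        shift = (base + np.array(best)) * step     # refined estimate at full resolution
        confidence = best_score / max(float(a.sum()), 1.0)
    return shift, confidence


def fuse_motion(visual_shift, visual_confidence, imu_shift, imu_confidence=1.0):
    """Blend visual and inertial motion estimates by their confidence weights."""
    w = visual_confidence / (visual_confidence + imu_confidence)
    return w * np.asarray(visual_shift) + (1.0 - w) * np.asarray(imu_shift)
```

In this toy version the inertial contribution is reduced to a 2D shift with a fixed confidence; an actual implementation would fuse full camera poses, for example rotation from the IMU gyroscope combined with the visually estimated transformation.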

Description

[0001] This patent application claims the benefit of priority to US Provisional Patent Application No. 62/437,027, filed December 20, 2016, the disclosure of which is hereby incorporated by reference in its entirety.

Technical Field

[0002] The subject matter disclosed herein relates generally to dynamic vision sensors (DVS), and more particularly, to an apparatus and method for estimating the pose of a DVS.

Background

[0003] The output of a DVS is the event-based change in brightness sensed by the camera. Typically, the output of a DVS is a stream of events in which each event is associated with a specific state (i.e., the event location within the image-sensor array and a binary state indicating a positive or negative change in brightness). A certain number of DVS events are sampled to form an image in which pixel locations containing one or more events are set to non-zero and all other pixel locations are set to zero. The value of each non-zero pixel can be ...
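The frame-forming step described in paragraph [0003] can be pictured with the short Python/NumPy sketch below. The event-tuple layout `(x, y, polarity)`, the 640x480 resolution, and the choice to store the most recent polarity as the non-zero value are assumptions made for illustration; the description above only states that pixels with at least one event become non-zero.

```python
# Illustrative sketch of forming an image from a sampled number of DVS events.
# The (x, y, polarity) tuple layout and the 640x480 resolution are assumptions.
import numpy as np


def events_to_frame(events, num_events, height=480, width=640):
    """Sample `num_events` DVS events into an image.

    Pixel locations that received one or more events become non-zero
    (+1 or -1 here, following the most recent event's polarity); all other
    pixel locations are set to zero.
    """
    frame = np.zeros((height, width), dtype=np.int8)
    for x, y, polarity in events[:num_events]:
        frame[y, x] = 1 if polarity > 0 else -1
    return frame


# Example: three synthetic events produce a frame with three non-zero pixels.
sample_events = [(10, 20, +1), (11, 20, -1), (300, 150, +1)]
assert np.count_nonzero(events_to_frame(sample_events, 3)) == 3
```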


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/46, G06K 9/62
CPC: G06V 10/462, G06F 18/25, G06T 7/74, G06T 2207/30244, G06T 2207/10028, G06T 7/277, G06T 7/269
Inventors: 冀正平, 石立龙, 王一兵, 柳贤锡, 伊利亚·奥夫桑尼科夫
Owner: SAMSUNG ELECTRONICS CO LTD