
Fusion method and device for adaptive video sensors

A video sensor fusion technology, applicable to television, color television, and closed-circuit television (CCTV) systems. It addresses the problems that existing fusion methods do not consider the reliability of the input sensors, that available sensor reliability functions fall short of practical requirements, and that fusion results are consequently inaccurate. The effect is reduced target-position error and improved accuracy.

Inactive Publication Date: 2016-11-09
BEIJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

The fusion methods described above are rarely applied to actual video sensor fusion and do not take the reliability of the input sensors into account, so a faulty sensor may corrupt the fusion process and produce inaccurate results. Moreover, existing sensor reliability functions cannot meet the requirements of most practical video surveillance systems, such as real-time operation and ease of implementation.

Method used


Examples


Embodiment Construction

[0025] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, embodiments of the present invention. The components of the embodiments of the invention, as generally described and illustrated in the figures herein, may be arranged and designed in a variety of different configurations. Accordingly, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. All other embodiments obtained by those skilled in the art on the basis of the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0026] It should be noted that like numerals and let...



Abstract

Embodiments of the present invention propose an adaptive video sensor fusion method and device. The method includes: acquiring video information of a moving target captured by different video sensors; calculating the tracking accuracy of the moving target as the reliability of each video sensor; and obtaining position information of the moving target tracked by the different video sensors, where the position information includes the position of the moving target. The target positions obtained from the different sensors are mapped onto a top view of the monitoring scene using a homography transformation. Based on each sensor's reliability, the position information from each sensor is filtered by a local adaptive filter to obtain an optimized position and state estimate of the moving target, and the optimized estimates from the different sensors are then fused. The embodiments of the present invention improve the fusion accuracy of video data from multiple video sensors.
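The steps summarized in the abstract can be sketched in code. This is a minimal illustrative sketch, not the patent's actual implementation: the homography matrices, sensor reliabilities, and the plain reliability-weighted average standing in for the adaptive filtering step are all assumptions introduced here for clarity.

```python
import numpy as np

def to_top_view(H, p):
    """Map an image-plane point p = (x, y) into the monitoring-scene
    top view using a 3x3 homography H (projective transform)."""
    v = H @ np.array([p[0], p[1], 1.0])
    return v[:2] / v[2]  # dehomogenize

def fuse_positions(positions, reliabilities):
    """Fuse top-view target positions from several sensors, weighting
    each sensor by its reliability (normalized tracking accuracy).
    A stand-in for the patent's adaptive-filter-based fusion."""
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()                      # normalize weights
    P = np.asarray(positions, dtype=float)
    return w @ P                         # reliability-weighted average

# Example: two sensors observe the same target.
H1 = np.eye(3)                           # sensor 1 already in top view
H2 = np.array([[1.0, 0.0,  2.0],         # sensor 2: a simple translation
               [0.0, 1.0, -1.0],
               [0.0, 0.0,  1.0]])
p1 = to_top_view(H1, (4.0, 3.0))
p2 = to_top_view(H2, (2.0, 4.0))         # maps to (4.0, 3.0)
fused = fuse_positions([p1, p2], reliabilities=[0.9, 0.6])
```

In a full system the reliability weights would be updated online from each sensor's measured tracking accuracy, so a failing sensor's contribution decays automatically.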

Description

Technical field

[0001] The invention relates to the field of multi-sensor data fusion, and in particular to an adaptive video sensor fusion method and device.

Background technique

[0002] With the development of multi-sensor systems and communication technologies, sensor fusion is applied at different levels and in different approaches. Signal-level fusion is the lowest level of fusion. In the field of video surveillance, it can be used to combine images acquired by homogeneous or heterogeneous sensors in the same scene. Before the fusion method is actually performed, the images to be fused must be correctly spatially registered. Signal-level fusion is the most restrictive, as it can only fuse image/video-like sensors that output images or video.

[0003] Compared with the limitations of signal-level fusion, feature-level fusion allows the video sensors to be fused to have different fields of view, such as different viewing angles, different distances from the target, etc., ...
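As a toy illustration of the signal-level fusion described above, two spatially registered grayscale frames can be blended pixel by pixel. The blending weight and frame contents here are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def fuse_frames(frame_a, frame_b, alpha=0.5):
    """Signal-level fusion sketch: blend two spatially registered,
    same-size grayscale frames pixel-by-pixel with weight alpha."""
    assert frame_a.shape == frame_b.shape, "frames must be registered and same size"
    return alpha * frame_a + (1.0 - alpha) * frame_b

# Example: uniform frames make the weighted blend easy to verify.
a = np.full((2, 2), 100.0)
b = np.full((2, 2), 200.0)
fused = fuse_frames(a, b, alpha=0.25)    # 0.25*100 + 0.75*200 = 175
```

This also makes the registration constraint concrete: the blend is only meaningful when corresponding pixels depict the same scene point, which is exactly the restriction that feature-level fusion relaxes.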

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N7/18; H04N5/14; H04N5/265
CPC: H04N7/18; H04N5/145; H04N5/265
Inventor: 杜军平 (Du Junping), 李清平 (Li Qingping)
Owner BEIJING UNIV OF POSTS & TELECOMM