
Multi-camera data fusion method in monitoring system

A data-fusion technology for monitoring systems, applied in the field of computer vision. It solves the problem that monitoring images with overlapping areas cannot be fused, and achieves the effects of wider applicability, improved imaging accuracy, and more efficient formation of fused images.

Pending Publication Date: 2020-03-06
成都威爱新经济技术研究院有限公司
Cites: 7 · Cited by: 7

AI Technical Summary

Problems solved by technology

[0004] To address the above deficiencies in the prior art, the present invention provides a multi-camera data fusion method for a monitoring system. It solves the problems that existing monitoring systems cannot fuse monitoring images with overlapping areas in real time, and have difficulty forming a corresponding tracking-fusion image of a monitored target.


Image

  • Multi-camera data fusion method in monitoring system

Examples


Embodiment 1

[0050] As shown in Figure 1, a multi-camera data fusion method in a monitoring system includes the following steps:

[0051] S1. Obtain monitoring images from several cameras whose monitoring areas overlap, and preprocess them to form standard monitoring images;

[0052] S2. Determine the fusion requirement of the monitoring images;

[0053] If the requirement is to fuse into a panoramic image, go to step S3;

[0054] If the requirement is to fuse into an object tracking image, go to step S4;

[0055] S3. Fuse the standard monitoring images according to their relevance to form a corresponding panoramic image, thereby realizing the data fusion of multiple cameras;

[0056] S4. Determine the tracking object in the standard monitoring images, extract all standard monitoring images containing the tracking object, and go to step S5;

[0057] S5. Arrange the extracted standard monitoring images in the chronological order in which the tracking obje...
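Steps S1 to S5 describe a two-branch pipeline: preprocessing followed by a dispatch to either panorama fusion (S3) or object-tracking fusion (S4/S5). A minimal Python sketch of this control flow is shown below; all function names and the frame dictionary layout are illustrative assumptions, not taken from the patent, and the fusion steps themselves are stubs.

```python
# Sketch of the S1-S5 dispatch in Embodiment 1.
# All names (preprocess, fuse_panorama, fuse_tracking) and the frame
# record format are illustrative assumptions, not from the patent.

def preprocess(image):
    """S1: stand-in for size standardization, grayscale, binarization, denoising."""
    return image  # placeholder: real preprocessing would transform the image

def fuse_panorama(images):
    """S3: fuse related standard images into one panoramic image (stub)."""
    return {"type": "panorama", "sources": len(images)}

def fuse_tracking(images, target):
    """S4-S5: keep only frames containing the target, order them by time, fuse."""
    frames = [im for im in images if target in im["objects"]]
    frames.sort(key=lambda im: im["time"])
    return {"type": "tracking", "target": target,
            "order": [im["camera"] for im in frames]}

def fuse(images, mode, target=None):
    """S2: route to the panorama (S3) or object-tracking (S4-S5) branch."""
    std = [preprocess(im) for im in images]
    if mode == "panorama":
        return fuse_panorama(std)
    return fuse_tracking(std, target)

cams = [
    {"camera": "C2", "time": 5, "objects": {"person_1"}},
    {"camera": "C1", "time": 2, "objects": {"person_1", "car_3"}},
    {"camera": "C3", "time": 9, "objects": {"car_3"}},
]
print(fuse(cams, "panorama"))
print(fuse(cams, "tracking", target="person_1"))
```

The tracking branch sorts by timestamp before fusing, which mirrors the "chronological order" requirement of step S5.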

Embodiment 2

[0059] The preprocessing of the monitoring images in step S1 of Embodiment 1 comprises sequentially performing size standardization, grayscale conversion, binarization, and denoising on each monitoring image. The size of each monitoring image is standardized to 512×512 pixels, which facilitates subsequent image data processing and fusion. The image obtained by grayscale conversion is called a grayscale image; each of its pixels is represented by a single brightness (intensity) value from 0 (black) to 255 (white), so each pixel requires only one byte to store its gray value (also called the intensity or brightness value), and the gray range is 0-255. In a color image, the color of each pixel is determined by the three components R, G, and B, each of which takes 256 values, so a single pixel can take more than 16 million colors (256×256×256)...

Embodiment 3

[0061] As shown in Figure 2, step S3 of the above-mentioned Embodiment 1 specifically comprises:

[0062] S31. According to the relative positional relationship between the standard monitoring images, determine for each standard monitoring image S_i its corresponding set of images to be fused T_i;

[0063] In the formula, the subscript i is the label of the standard monitoring image, i = 1, 2, 3, ..., I, where I is the total number of standard monitoring images S_i; T_i = {T_1, T_2, ..., T_n, ..., T_N}, where n is the label of an overlapping image that needs to be fused with the standard monitoring image S_i, n = 1, 2, 3, ..., N, and N is the total number of standard monitoring images that need to be fused with S_i;

[0064] S32. Sequentially calculate the feature point matching pair set K_i between the standard monitoring image S_i and each overlapping image in its corresponding set of images to be fused T_i;

[0065] S33. According to the feature point matching pair set K_i ...
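The feature point matching of step S32 can be sketched as a nearest-neighbour match over descriptor vectors with Lowe's ratio test to discard ambiguous pairs. The patent does not name a specific detector or matcher, so this pure-Python sketch is an illustrative assumption, with toy 2-D descriptors standing in for real feature descriptors.

```python
# Sketch of building a feature point matching pair set K_i (step S32):
# for each descriptor from image S_i, find its nearest descriptor in an
# overlapping image from T_i, keeping the pair only if it passes Lowe's
# ratio test. The matcher choice and toy descriptors are assumptions;
# the patent does not specify a particular matching algorithm.

def dist(a, b):
    """Euclidean distance between two descriptor vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_pairs(desc_s, desc_t, ratio=0.75):
    """Return index pairs (i, j) whose best match is clearly better
    than the second-best candidate (Lowe's ratio test)."""
    pairs = []
    for i, d in enumerate(desc_s):
        ranked = sorted(range(len(desc_t)), key=lambda j: dist(d, desc_t[j]))
        best, second = ranked[0], ranked[1]
        if dist(d, desc_t[best]) < ratio * dist(d, desc_t[second]):
            pairs.append((i, best))
    return pairs

# Toy descriptors: the first two points in desc_s have unambiguous
# partners in desc_t; the third is ambiguous and gets rejected.
desc_s = [(0.0, 0.0), (10.0, 10.0), (5.0, 5.0)]
desc_t = [(0.1, 0.0), (10.0, 10.1), (5.0, 4.9), (5.1, 5.0)]
print(match_pairs(desc_s, desc_t))  # [(0, 0), (1, 1)]
```

Restricting the matching to the precomputed overlap set T_i, rather than all camera pairs, is what reduces the amount of calculation during fusion.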



Abstract

The invention discloses a multi-camera data fusion method in a monitoring system, comprising the steps: S1, preprocessing monitoring images with overlapping monitoring regions to form standard monitoring images; S2, determining the fusion requirement of the monitoring images; if the requirement is fusing to form a panoramic image, entering S3; if the requirement is fusing to form an object tracking image, entering S4; S3, fusing the standard monitoring images according to their relevance to form corresponding panoramic images; S4, determining a tracking object and extracting all standard monitoring images containing the tracking object; and S5, arranging the extracted standard monitoring images according to the chronological order in which the tracking object appears, and fusing them to form an object tracking image. The method provides multi-camera data fusion suitable for different scenes, covering both the presentation of panoramic images and the tracking of specific objects in existing monitoring systems; it reduces the amount of calculation during image data fusion and improves both the efficiency of image data fusion and the presentation quality of the fused images.

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a multi-camera data fusion method in a monitoring system.

Background

[0002] With the growing number of installed cameras and the increasing importance of public safety, manual video surveillance can no longer meet current security needs, so intelligent surveillance technology has become more and more widely used.

[0003] In recent years, more and more intelligent monitoring systems have been applied in commercial, legal, and military fields. Visual monitoring of dynamic scenes has become a cutting-edge research direction of computer vision, with broad application prospects and potential economic value. With the rapid development of modern technology, cameras are becoming ever cheaper. In most cases, due to the limited field of view (FOV, Field of View) of a single camera, and the presence of occlusions in r...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC(8): G06T5/50, G06T5/00
CPC: G06T5/50, G06T2207/20221, G06T5/70
Inventor: 吕云
Owner: 成都威爱新经济技术研究院有限公司