
Tracking-loss re-detection method for weak and small moving targets based on NCC matching and frame difference

A target tracking technology applied in the field of tracking-loss re-detection for weak and small moving targets based on NCC matching and frame difference. It addresses problems such as background clutter and failure of continuous target tracking, and achieves high time efficiency, effective and fast tracking, and reduced manual intervention.

Active Publication Date: 2020-03-17
BEIJING RACOBIT ELECTRONIC INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] During the target tracking process, background clutter, illumination changes, partial or complete occlusion, changes in target pose, and rapid target motion in the video image can all cause continuous target tracking to fail.



Examples


Embodiment 1

[0039] Embodiment 1. A tracking-loss re-detection method for weak and small moving targets based on NCC matching and frame difference. The process is shown in Figure 1 and includes the following steps:

[0040] S1: Collect video data consisting of continuous frame images. In this embodiment, a drone shoots autonomously; the collected video images are 720×1280-pixel color data, the drone's flying height is 103 meters, and the video frame rate is 100 frames per second. The target to be tracked and the video data to be processed in this embodiment are shown in Figure 2(a) and (b).

[0041] S2: Use a target tracking algorithm to track the target over the continuous frame images. For the continuous multi-frame images in which the target is tracked, calculate the multi-dimensional features of the tracked target area in each frame image as the normal features, and calculate each frame's normal-feature offset relati...
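Step S2 builds a per-frame "normal feature" vector and measures each frame's offset from the feature mean. Below is a minimal NumPy sketch, assuming a binary mask of the tracked region in each frame; the function names and the four features shown (length, width, aspect ratio, duty cycle, following the feature list in Embodiment 2) are illustrative, not the patent's exact implementation:

```python
import numpy as np

def region_features(mask: np.ndarray) -> np.ndarray:
    """Length, width, aspect ratio, and duty cycle of a binary region mask."""
    ys, xs = np.nonzero(mask)
    h = ys.max() - ys.min() + 1          # length of the bounding rectangle
    w = xs.max() - xs.min() + 1          # width of the bounding rectangle
    aspect = h / w                       # aspect ratio
    fill = mask.sum() / (h * w)          # duty cycle: region area / bounding-rect area
    return np.array([h, w, aspect, fill], dtype=float)

def normal_offsets(masks) -> np.ndarray:
    """Offset of each frame's feature vector from the mean over the tracked frames."""
    feats = np.stack([region_features(m) for m in masks])
    return feats - feats.mean(axis=0)
```

The per-feature offsets (rather than raw values) are what later get compared against a candidate region's offsets to accept or reject a re-detection.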

Embodiment 2

[0059] Embodiment 2. In the technical solution described in Embodiment 1, the multi-dimensional features used in this embodiment include length, width, aspect ratio, duty cycle, area of the smallest circumscribed rectangle, spatial expansion degree, compactness, and symmetry.

[0060] The length, width and aspect ratio are the length, width and aspect ratio of the target region or the candidate region.

[0061] The duty cycle is the ratio of the area of the target area or the candidate area to the area of its smallest bounding rectangle.

[0062] The area of the minimum bounding rectangle is the area of the minimum bounding rectangle of the target area or the candidate area.

[0063] The spatial expansion degree is the sum of the distances from all points in the target area or the candidate area to the main axis of the area normalized by the length of the main axis.

[0064] Compactness is the degree to which the shape of the target area or the candidate area deviates...
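The spatial expansion degree and compactness definitions above can be illustrated as follows. The patent text is truncated here, so this sketch uses common definitions (PCA principal axis for spatial expansion; perimeter²/(4π·area) for compactness) and should be read as an assumption, not the patent's exact formulas:

```python
import numpy as np

def spatial_expansion(mask: np.ndarray) -> float:
    """Sum of point distances to the region's principal axis,
    normalized by the principal-axis length (axis found via PCA)."""
    pts = np.argwhere(mask).astype(float)
    pts -= pts.mean(axis=0)
    cov = np.cov(pts.T)
    vals, vecs = np.linalg.eigh(cov)
    axis = vecs[:, np.argmax(vals)]                 # principal axis direction
    proj = pts @ axis                               # coordinates along the axis
    length = proj.max() - proj.min()                # principal-axis length
    perp = np.abs(pts @ np.array([-axis[1], axis[0]]))  # distance to the axis
    return perp.sum() / max(length, 1e-9)

def compactness(mask: np.ndarray) -> float:
    """Perimeter^2 / (4*pi*area): 1.0 for a disk, larger for irregular shapes
    (a common compactness measure, assumed here)."""
    m = mask.astype(bool)
    pad = np.pad(m, 1)
    # interior pixels: region pixels whose four neighbours are also in the region
    interior = (pad[1:-1, 1:-1] & pad[:-2, 1:-1] & pad[2:, 1:-1]
                & pad[1:-1, :-2] & pad[1:-1, 2:])
    perim = m.sum() - interior.sum()                # boundary pixel count
    return perim ** 2 / (4 * np.pi * m.sum())
```

Both measures are scale-aware shape descriptors, which is why the method compares their offsets from the tracked-frame mean rather than their absolute values.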



Abstract

The invention discloses a tracking-loss re-detection method for weak moving targets based on NCC matching and frame difference. The method includes the following steps: acquire video data; perform target tracking over continuous frame images, and for each of the continuous N frames in which the target is tracked, calculate the multi-dimensional features of the target area as the normal features and calculate the offset of each feature from its mean as the normal offset; when the target is lost in the f-th frame, intercept a reference image from the f-th frame, intercept a larger image to be matched from the (f+k)-th frame, and calculate the cross-correlation coefficient matrix between them; intercept an area of a set size centered on the position with the largest cross-correlation coefficient, obtain a candidate region by the frame difference method, and extract the multi-dimensional features of the candidate region; if the deviation between each feature's offset and the normal offset is within a set range, take the candidate region as the target area; otherwise, increment k and repeat the above process. The method can prevent the false detections of traditional methods.
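The re-detection step described in the abstract (an NCC coefficient matrix over a search window, the peak position, then a frame-difference candidate region) can be sketched in NumPy as follows. The function names, the brute-force NCC loop, and the difference threshold are all illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def ncc_matrix(search: np.ndarray, templ: np.ndarray) -> np.ndarray:
    """Normalized cross-correlation coefficient for every placement of
    templ inside search (brute force; adequate for small patches)."""
    th, tw = templ.shape
    t = templ - templ.mean()
    tn = np.sqrt((t * t).sum())
    out = np.zeros((search.shape[0] - th + 1, search.shape[1] - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            win = search[y:y + th, x:x + tw].astype(float)
            w = win - win.mean()
            denom = np.sqrt((w * w).sum()) * tn
            out[y, x] = (w * t).sum() / denom if denom > 0 else 0.0
    return out

def redetect(frame_f, frame_fk, ref_box, thresh=30):
    """Match the reference patch from frame f inside frame f+k, then
    frame-difference around the NCC peak to get a candidate region mask."""
    y0, x0, h, w = ref_box
    templ = frame_f[y0:y0 + h, x0:x0 + w].astype(float)
    ncc = ncc_matrix(frame_fk, templ)
    py, px = np.unravel_index(np.argmax(ncc), ncc.shape)  # NCC peak position
    diff = np.abs(frame_fk.astype(int) - frame_f.astype(int))
    # candidate region: strong frame-difference pixels around the NCC peak
    region = diff[py:py + h, px:px + w] > thresh
    return (int(py), int(px)), region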

Description

technical field

[0001] The invention belongs to the research field of target detection and tracking, and specifically relates to a tracking-loss re-detection method for weak and small moving targets based on NCC matching and frame difference.

Background technique

[0002] Target tracking is the process of continuously detecting a target across different video frame images, and it is a very important topic in computer vision research. According to the relationship between the camera and the moving target, target detection can be divided into detection under a static background and detection under a dynamic background. For detecting moving objects when the background is static, the main approaches are the background subtraction method, the frame difference method, and the optical flow method. For target detection under background motion, the global motion parameters between adjacent video frames are often obtained through global motion compensation, and the current image ...
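For reference, the frame difference method mentioned above for static backgrounds is, in its simplest textbook form, a thresholded absolute difference between consecutive frames (a generic sketch, not the patent's specific variant):

```python
import numpy as np

def frame_difference(prev: np.ndarray, curr: np.ndarray, thresh: int = 25) -> np.ndarray:
    """Binary motion mask: pixels whose absolute intensity change
    between consecutive frames exceeds `thresh`."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > thresh).astype(np.uint8)
```

The cast to int16 avoids unsigned-integer wraparound when subtracting uint8 frames.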

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/246; G06T7/62
CPC: G06T2207/10016; G06T2207/30232
Inventors: 曾大治, 梁若飞, 章菲菲, 陈宇翔
Owner: BEIJING RACOBIT ELECTRONIC INFORMATION TECH CO LTD