
Self-adaptation target tracking method based on vision saliency characteristics

An adaptive target tracking technology in the field of computer vision. It addresses the problem that existing methods cannot track targets effectively, and achieves good real-time performance, improved stability, and reduced tracking error.

Inactive Publication Date: 2015-03-25
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS
Cites: 2 · Cited by: 13

AI Technical Summary

Benefits of technology

The method combines color features with visual-saliency features to describe the target, which enhances the target and suppresses background interference in the candidate region. By adaptively fusing the two features according to their similarity coefficients, the method reduces tracking error, improves stability against background clutter and partial occlusion, and maintains good real-time performance.

Problems solved by technology

The patented technology addresses accurate and fast target tracking in complex, dynamic scenes. Existing methods degrade under illumination and shading changes, camera motion, background clutter that resembles the target, and partial occlusion, which makes stable and accurate tracking over video sequences difficult.

Method used




Embodiment Construction

[0034] The present invention will be further described below in conjunction with the accompanying drawings.

[0035] The principle of the method of the present invention is to detect the target's visual saliency map by frequency-domain filtering, describe the target model with combined color and visual-saliency features, and adaptively adjust the fusion weight of the transfer vector according to the size of the similarity coefficient, so as to track the target accurately against a complex background. The visual saliency feature effectively enhances the target and suppresses interference; by integrating the features of the visually salient region, it strengthens the description of the target in the candidate area and weakens the interference of background information. Using the visual attention mechanism to screen out salient information and supply it to the tracking method improves the efficiency of information processing.



Abstract

The invention discloses a self-adaptive target tracking method based on visual saliency characteristics. The method comprises the following steps: (1) building a color-image quaternion model, extracting and processing the characteristic information of the four channels of the image, and detecting the image's visual saliency map; (2) on the basis of the detected visual saliency map, extracting a visual-saliency-characteristic kernel histogram, and measuring the similarity between features with the Bhattacharyya coefficient metric; (3) determining a characteristic fusion strategy, and calculating self-adaptive fusion weights for the image color characteristics and the visual saliency characteristics; (4) on the basis of the Bhattacharyya coefficients, determining the target-center-position transfer-vector weights, and carrying out iterative optimization to obtain the final target center position. The method effectively overcomes interference from the target blending into the background and is robust to partial occlusion of the target.
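Steps (2) and (3) can be illustrated with a short sketch: the Bhattacharyya coefficient measures similarity between the model histogram and a candidate histogram, and the fusion weights adapt to those similarity coefficients. The proportional weighting rule in `adaptive_fusion_weights` is an assumed concrete formula for illustration; the patent states only that the weights adapt to the size of the similarity coefficients.

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two histograms (1.0 = identical)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()  # normalize to probability distributions
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

def adaptive_fusion_weights(rho_color, rho_saliency):
    """Fusion weights proportional to each feature's similarity coefficient.

    rho_color / rho_saliency are Bhattacharyya coefficients of the
    color and saliency feature histograms against the target model.
    """
    total = rho_color + rho_saliency
    if total <= 0.0:
        return 0.5, 0.5  # no evidence either way: equal weights
    return rho_color / total, rho_saliency / total
```

A feature whose model-candidate similarity drops (for example, the color histogram under an illumination change) then contributes less to the fused center-position update, which is the intended self-adaptive behavior.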

Description


Claims


Application Information

Owner: NANJING UNIV OF AERONAUTICS & ASTRONAUTICS