User terminal and object tracking method and device thereof

A target tracking technology, applied in image analysis, image enhancement, instruments, etc., which addresses the problems that existing algorithms cannot handle both scale changes of the tracking target and real-time tracking, and that a single feature cannot adapt to multiple different scenes, among other issues

Inactive Publication Date: 2017-05-17
SPREADTRUM COMM (TIANJIN) INC

Problems solved by technology

[0009] a) In the feature extraction step, a single feature such as grayscale, color or HOG is usually used. Since different features perform differently in different scenes, a single feature cannot adapt to multiple different scenes.
[0010] b) Some current target tracking algorithms, such as CSK, MOSSE and KCF, can only estimate the target's position offset, which leads to poor tracking performance when the tracking target undergoes a large scale change, while other target tracking algorithms can only estimate scale changes at low frame rates and cannot meet the requirements of real-time tracking.
That is to say, it is difficult to handle both the scale change of the tracking target and real-time tracking; a sketch of scale search follows below.
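
To make the scale limitation concrete, the following is a minimal Python sketch, assuming a hypothetical `response_fn` helper (not part of the patent) that returns a position-only response map for a patch of a given size; wrapping it in a small scale pyramid recovers both a translation and an approximate scale change:

```python
import numpy as np

def best_scale_and_position(response_fn, frame, center, base_size,
                            scales=(0.95, 1.0, 1.05)):
    """Evaluate a position-only tracker at several candidate scales and keep
    the scale whose response peak is strongest (hypothetical helper API)."""
    best_score, best_center, best_scale = -np.inf, center, 1.0
    for s in scales:
        size = (int(base_size[0] * s), int(base_size[1] * s))
        resp = response_fn(frame, center, size)          # 2-D response map
        peak = np.unravel_index(np.argmax(resp), resp.shape)
        if resp[peak] > best_score:
            # offset of the peak from the map centre gives the translation
            dy = peak[0] - resp.shape[0] // 2
            dx = peak[1] - resp.shape[1] // 2
            best_score = resp[peak]
            best_center = (center[0] + dy, center[1] + dx)
            best_scale = s
    return best_center, best_scale
```

Evaluating extra scales on every frame is precisely what costs frame rate, which is the trade-off paragraph [0010] points at.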

Method used

figure 1 is a flow chart of the target tracking method provided by the present invention; image 3 is a structure block diagram of the target tracking device

Examples

Embodiment 1

[0114] As described below, an embodiment of the present invention provides a method for object tracking.

[0115] Referring to figure 1, which shows a flow chart of the target tracking method, the method is described in detail below through specific steps:

[0116] S101. Train the parameter model of the target according to the image frame A.

[0117] In this embodiment, under the tracking-detection framework, the correlation filter in the KCF algorithm is used, and combined features and scale estimation are added to improve the performance of the algorithm.
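
As a rough illustration of such a combined feature, the sketch below concatenates per-cell orientation histograms (a simplified stand-in for FHOG, not the full 31-channel Felzenszwalb descriptor) with cell-averaged HSI color channels; the helper names and the 4-pixel cell size are assumptions, not taken from the patent:

```python
import numpy as np

def rgb_to_hsi(img):
    """Convert an RGB image in [0, 1] to HSI (hue, saturation, intensity),
    a simplified stand-in for the color feature mentioned in the patent."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    i = (r + g + b) / 3.0
    s = 1.0 - np.min(img, axis=-1) / (i + 1e-8)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-8
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2 * np.pi - theta) / (2 * np.pi)
    return np.stack([h, s, i], axis=-1)

def cell_hog(gray, cell=4, bins=9):
    """Small HOG-style descriptor: per-cell orientation histograms weighted by
    gradient magnitude (an approximation of FHOG, not the exact descriptor)."""
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # unsigned orientation
    h, w = gray.shape
    H, W = h // cell, w // cell
    bin_idx = np.minimum((ang / np.pi * bins).astype(int), bins - 1)
    feat = np.zeros((H, W, bins))
    for i in range(H):
        for j in range(W):
            m = mag[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            b = bin_idx[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            feat[i, j] = np.bincount(b.ravel(), weights=m.ravel(), minlength=bins)
    return feat

def combined_feature(rgb_patch, cell=4):
    """Concatenate HOG-style channels with cell-averaged HSI color channels."""
    gray = rgb_patch.mean(axis=-1)
    hog = cell_hog(gray, cell)
    hsi = rgb_to_hsi(rgb_patch)
    H, W, _ = hog.shape
    hsi_cells = hsi[:H * cell, :W * cell].reshape(H, cell, W, cell, 3).mean(axis=(1, 3))
    return np.concatenate([hog, hsi_cells], axis=-1)   # H x W x (bins + 3)
```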

[0118] As mentioned earlier, the tracking-detection algorithm consists of two steps, training and detection. Training generally refers to extracting samples according to the target position in the previous frame and then using machine learning algorithms to train the parameter model. Detection classifies the samples of the current frame according to the parameter model trained on the previous frame and predicts the target position of the current f...
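
For a single feature channel and a linear kernel, the training and detection steps reduce to the standard closed-form correlation-filter solution shared by MOSSE and KCF; the sketch below shows only that cycle, while the kernelized multi-channel combined features and the scale estimation added by this embodiment are omitted:

```python
import numpy as np

def gaussian_labels(h, w, sigma=2.0):
    """Desired response: a 2-D Gaussian whose peak is wrapped to (0, 0),
    the usual regression target for correlation filters."""
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2 * sigma ** 2))
    return np.roll(np.roll(g, -(h // 2), axis=0), -(w // 2), axis=1)

def train(x, lam=1e-4):
    """Train on a patch x centred on the target (frame A): closed-form ridge
    regression in the Fourier domain with a linear kernel."""
    X = np.fft.fft2(x)
    y = gaussian_labels(*x.shape)
    kxx = X * np.conj(X)                         # auto-correlation, frequency domain
    alpha_hat = np.fft.fft2(y) / (kxx + lam)     # learned dual coefficients
    return alpha_hat, X

def detect(alpha_hat, X_model, z):
    """Detect on a patch z from frame B: correlate with the model, locate the
    response peak and convert it into a (dy, dx) displacement of the centre."""
    Z = np.fft.fft2(z)
    kxz = np.conj(X_model) * Z                   # cross-correlation with the model
    resp = np.real(np.fft.ifft2(alpha_hat * kxz))
    py, px = np.unravel_index(np.argmax(resp), resp.shape)
    dy = py if py <= resp.shape[0] // 2 else py - resp.shape[0]
    dx = px if px <= resp.shape[1] // 2 else px - resp.shape[1]
    return dy, dx
```

In a practical tracker the model would also be updated over time, for example by linearly interpolating `alpha_hat` and `X_model` between frames.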

Embodiment 2

[0183] As described below, an embodiment of the present invention provides a target tracking device.

[0184] Referring to image 3, the structure block diagram of the target tracking device is shown.

[0185] The target tracking device includes: a model training unit 301 and a target prediction unit 302; wherein the main functions of each unit are as follows:

[0186] The model training unit 301 is adapted to train the parameter model of the target according to the image frame A;

[0187] The target prediction unit 302 is adapted to predict the position of the target in the image frame B according to the trained parameter model of the target after the operation performed by the model training unit 301;

[0188] Training the parameter model of the target includes: training a first parameter model of the target and training a second parameter model of the target;

[0189] Training the first parameter model of the target includes: training the first parameter model of the target t...
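
A schematic of how the two units and their composition might be organized in code; the class and method names are illustrative only, and the `trainer`/`detector` callables stand in for the parameter-model training and position prediction described in paragraphs [0186] and [0187]:

```python
class ModelTrainingUnit:
    """Sketch of unit 301: trains the target's parameter model from image frame A."""
    def __init__(self, trainer):
        self.trainer = trainer          # e.g. a correlation-filter training routine
        self.model = None

    def update(self, frame_a, target_box):
        self.model = self.trainer(frame_a, target_box)
        return self.model


class TargetPredictionUnit:
    """Sketch of unit 302: predicts the target position in image frame B using
    the model produced by the training unit."""
    def __init__(self, detector):
        self.detector = detector        # e.g. a correlation-filter detection routine

    def predict(self, frame_b, model, prev_box):
        return self.detector(frame_b, model, prev_box)   # predicted bounding box


class TargetTrackingDevice:
    """Composition of the two units, mirroring the block diagram of image 3."""
    def __init__(self, trainer, detector):
        self.training_unit = ModelTrainingUnit(trainer)
        self.prediction_unit = TargetPredictionUnit(detector)

    def track(self, frame_a, frame_b, box_a):
        model = self.training_unit.update(frame_a, box_a)
        return self.prediction_unit.predict(frame_b, model, box_a)
```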

Embodiment 3

[0233] As described below, an embodiment of the present invention provides a user terminal.

[0234] The difference from the prior art is that the user terminal further includes the target tracking device provided in the embodiment of the present invention. Therefore, when the user terminal tracks a target, it uses the correlation filter of the KCF algorithm under the tracking-detection framework, and adds a combined feature composed of the FHOG feature and the HSI color feature to improve the performance of the algorithm. This scheme is a target tracking algorithm that uses multiple kinds of features to express information: it not only has high real-time performance, but also can effectively deal with the adverse effects of unfavorable factors such as complex background, illumination and non-rigid transformation on target tracking.
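
On the terminal side the tracking-detection cycle reduces to a simple frame-by-frame loop; the following hypothetical usage sketch (function names are illustrative, not from the patent) alternates training on the previous frame's result with detection on the current frame:

```python
def track_video(frames, init_box, trainer, detector):
    """Frame-by-frame tracking loop: train on the last known target position,
    detect in the next frame, then retrain on the detected position."""
    boxes = [init_box]
    model = trainer(frames[0], init_box)
    for frame in frames[1:]:
        box = detector(frame, model, boxes[-1])   # predicted target position
        boxes.append(box)
        model = trainer(frame, box)               # update the parameter model
    return boxes
```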

[0235] In a specific implementation, the user terminal may be a smart phone or a tablet computer.

[0236] Those of ordinary skill in the art c...

Abstract

The invention relates to a user terminal and an object tracking method and device thereof. The method comprises: training a parameter model of an object according to a first picture frame; and predicting the position of the object in a second picture frame according to the trained parameter model of the object. Training the parameter model of the object comprises training a first parameter model and a second parameter model of the object, and the second parameter model of the object is described by combining the FHOG feature and the color feature of the object. When the object is tracked, a correlation filter of the KCF algorithm is used under a tracking-detection framework, and a combined feature formed from the FHOG and color features is added to improve the performance of the algorithm. The object tracking algorithm uses multiple types of features to express information, has high real-time performance, and can effectively handle the adverse influence of unfavorable factors, including complex background, illumination and non-rigid transformation, on object tracking.

Description

technical field

[0001] The present invention relates to the technical field of wireless communication, and in particular to a user terminal and a target tracking method and device thereof.

Background technique

[0002] Intelligent human-computer interaction is the development direction of future mobile multimedia applications, and object tracking is the basis of intelligent human-computer interaction. There are many object tracking algorithms in the prior art. Given a video sequence and the initial position of the target, a target tracking algorithm can automatically locate the position of the target in the video sequence.

[0003] There are two types of target tracking algorithms in the prior art:

[0004] 1) Tracking-by-detection algorithm (tracking-by-detection)

[0005] This type of algorithm generally includes two steps, training and detection. Training generally refers to extracting samples according to the ...

Claims

Application Information

IPC(8): G06T7/20; G06T7/292
CPC: G06T2207/20081
Inventor 潘博阳陈敏杰刘阳郭春磊林福辉
Owner SPREADTRUM COMM (TIANJIN) INC