Visual tracking method for following robot

A following-robot visual tracking technology, applied in instruments, image data processing, computing, etc., which solves the problems of low tracking accuracy and heavy computation and achieves the effects of enhanced stability and smoother tracking.

Inactive Publication Date: 2018-04-13
ZHEJIANG UNIV OF TECH


Problems solved by technology

[0005] In order to overcome the shortcomings of existing visual tracking methods, namely heavy computation and low tracking accuracy, the present invention provides a visual tracking method for following robots that effectively improves tracking accuracy and real-time performance while ensuring tracking stability.



Examples


Embodiment Construction

[0027] The present invention will be further described below in conjunction with the accompanying drawings.

[0028] Referring to Figure 1 to Figure 5, a visual tracking method for a following robot comprises the following steps:

[0029] Step 1) Fuse the image information and depth information of the first frame and calculate the centroid of the user, then compare the depth value at the centroid position with the depth values of the surrounding pixels to determine the region of the depth map belonging to the user; this region is the target user template;
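The patent does not publish an implementation of this fusion step. As a minimal sketch of the idea, assuming an aligned depth map in metres and a seed pixel known to lie on the user (for example from a person detector), a depth-based region-growing pass can recover the user region; the function name, depth tolerance and 4-connected strategy below are illustrative assumptions, not the claimed method.

```python
import numpy as np

def extract_user_template(depth, seed_xy, depth_tol=0.15):
    """Grow a user region in the depth map around a seed pixel (hedged sketch).

    depth     : HxW float array of depth values in metres, aligned with the RGB frame
    seed_xy   : (x, y) pixel assumed to lie on the user
    depth_tol : pixels whose depth differs from the seed by less than this are kept

    Returns a boolean mask of the target user template and its image-plane centroid.
    """
    h, w = depth.shape
    sx, sy = seed_xy
    seed_depth = depth[sy, sx]

    mask = np.zeros((h, w), dtype=bool)
    stack = [(sx, sy)]
    while stack:                                  # 4-connected region growing
        x, y = stack.pop()
        if x < 0 or y < 0 or x >= w or y >= h or mask[y, x]:
            continue
        if depth[y, x] <= 0 or abs(depth[y, x] - seed_depth) > depth_tol:
            continue
        mask[y, x] = True
        stack.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])

    ys, xs = np.nonzero(mask)
    centroid = (xs.mean(), ys.mean())             # centroid of the user region
    return mask, centroid
```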

[0030] Step 2) Establish color probability statistical models of the target user template and of the candidate user template in the next frame, measure them with a similarity function, and iterate with the mean-shift algorithm; the region with the highest similarity coefficient is the best candidate user template in that frame;
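As a hedged illustration of the colour-model plus mean-shift idea (OpenCV's hue back-projection and meanShift stand in for the patent's similarity-function formulation, which is typically a Bhattacharyya-style coefficient), one frame of the iteration could look like the sketch below; the function signature and parameters are assumptions.

```python
import cv2
import numpy as np

def meanshift_candidate(frame_bgr, user_mask, prev_window, iters=10, eps=1):
    """Colour probability model + mean-shift window update (hedged sketch).

    frame_bgr   : current BGR frame
    user_mask   : boolean mask of the target user template (from step 1)
    prev_window : (x, y, w, h) search window from the previous frame
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Colour probability statistical model: hue histogram of the template region.
    hist = cv2.calcHist([hsv], [0], user_mask.astype(np.uint8) * 255, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

    # Back-projection gives, per pixel, the likelihood of belonging to the model.
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)

    # Mean-shift iteration: the window converges on the densest likelihood region,
    # i.e. the best candidate user template in this frame.
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, iters, eps)
    _, new_window = cv2.meanShift(backproj, prev_window, term)
    return new_window
```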

[0031] Step 3) By comparing the depth values of the center point of the best candi...
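This paragraph is truncated in the source; according to the abstract, step 3 adaptively processes the tracking window from the depth map. One plausible sketch, assuming the window is simply rescaled around its centre in inverse proportion to the target's current depth (the function and limits below are hypothetical, not the patented rule):

```python
def adapt_window_to_depth(window, depth, ref_depth, min_size=(24, 48)):
    """Rescale the tracking window with target depth (hedged sketch).

    window    : (x, y, w, h) window returned by the mean-shift step
    depth     : current depth map, aligned with the colour frame
    ref_depth : depth of the user when the template was extracted

    The apparent size of the user varies roughly inversely with depth, so the
    window is scaled by ref_depth / current_depth about its centre.
    """
    x, y, w, h = window
    cx, cy = x + w // 2, y + h // 2
    d = float(depth[cy, cx])
    if d <= 0:                       # invalid depth reading: keep the old window
        return window
    scale = ref_depth / d
    new_w = max(int(w * scale), min_size[0])
    new_h = max(int(h * scale), min_size[1])
    return (cx - new_w // 2, cy - new_h // 2, new_w, new_h)
```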



Abstract

A visual tracking method for a following robot includes the following steps: (1) extracting a tracking template; (2) iterating with a mean-shift algorithm to obtain the position of a tracking target; (3) adaptively processing a window according to a depth map; (4) adjusting the tracking window according to Kalman filtering and a similarity function; and (5) calculating the center-of-mass coordinates of the target and transforming them to the camera coordinate system. According to the invention, a target user template is first obtained; iteration is then carried out with the mean-shift algorithm to obtain the position of the target user; next, the tracking window is adaptively processed according to the depth map and the center of mass of the target user is calculated; finally, the position of the user relative to the robot is computed for tracking. The designed visual tracking method can be used in complex environments and has high tracking precision.
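Steps (4) and (5) of the abstract, Kalman smoothing of the tracking window and transforming the centroid to camera coordinates, might be prototyped as in the sketch below; the constant-velocity state model, the noise covariances and the pinhole back-projection with hypothetical intrinsics fx, fy, cx, cy are assumptions, not the patented formulation.

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter over the window centre (u, v): a hedged sketch of
# step (4). Gating the correction with the similarity coefficient stands in for the
# patent's "Kalman filtering and similarity function" window adjustment.
kf = cv2.KalmanFilter(4, 2)                        # state [u, v, du, dv], measurement [u, v]
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)
# Seed before tracking, e.g. kf.statePost = np.array([[u0], [v0], [0], [0]], np.float32)

def smooth_centre(measured_uv, reliable=True):
    """Predict, then correct with the mean-shift result only when it is trusted
    (e.g. the similarity coefficient exceeds a threshold)."""
    pred = kf.predict()
    if reliable:
        pred = kf.correct(np.array(measured_uv, dtype=np.float32).reshape(2, 1))
    return float(pred[0, 0]), float(pred[1, 0])

def pixel_to_camera(u, v, z, fx, fy, cx, cy):
    """Pinhole back-projection of the centroid (u, v) at depth z into camera
    coordinates (step 5); fx, fy, cx, cy are the depth camera intrinsics."""
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)
```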

Description

Technical field

[0001] The invention relates to the field of intelligent mobile robots, and in particular to a visual tracking method for mobile robots.

Background technique

[0002] In the field of intelligent robots, human-machine collaborative robots have great development potential. By introducing effective human-machine cooperation, mobile robots can improve their adaptability to complex environments and complete complex tasks, in particular the item-carrying tasks often encountered in daily life. The usual solution is for accompanying personnel to carry the items, or to carry them in batches, which has the disadvantages of high labor intensity, high cost and low efficiency.

[0003] In recent years, following robots have begun to appear in our lives. For example, in airports, robots can be used to help the elderly carry luggage; in warehouses, staff do not need forklift-driving skills, since simple human-computer interaction can g...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/277; G06T7/246; G06T7/90; G06T7/50
CPC: G06T2207/10024; G06T2207/10028; G06T2207/20021; G06T7/251; G06T7/277; G06T7/50; G06T7/90
Inventors: 俞立; 何佳燊; 杨旭升; 王瑶为; 王亚男
Owner: ZHEJIANG UNIV OF TECH