
A mobile robot automatic following method suitable for multi-person scenarios

A mobile robot automatic-following technology, classified under instruments, target-seeking control, non-electric variable control, etc. It addresses the problems of poor tracking accuracy and the inability to distinguish the followed target from interfering targets, and achieves the effect of improved stability.

Active Publication Date: 2022-08-09
QUANZHOU INST OF EQUIP MFG

AI Technical Summary

Problems solved by technology

At present there are many methods for a robot to follow a person's legs; the most widely used are automatic following based on a single camera and automatic following based on a single lidar. The vision-based leg-tracking method is affected by illumination and the camera's field of view, so its tracking accuracy is poor. Lidar-based leg-tracking methods, such as the robot automatic following method based on the ROS robot operating system disclosed in Chinese patent document CN107272680A, cannot distinguish the followed target from interfering targets in a multi-person scene.




Embodiment Construction

[0059] Referring to Figures 1 to 3, an automatic following method for a mobile robot suitable for multi-person scenes includes the following steps:

[0060] Step S1, start sampling: the lidar on the robot begins scanning and collecting data, and the robot obtains the laser data points of the followed target's legs, then proceeds to step S2. The leg laser data points include, but are not limited to, the distance information reflected back from occluding objects scanned by the lidar.
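The sampling step above can be sketched as converting a raw lidar range scan into 2-D points around the robot. This is a minimal illustration, not the patent's implementation; the `max_range` cutoff and the ROS-style scan fields (`ranges`, `angle_min`, `angle_increment`) are assumptions.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, max_range=3.0):
    """Convert a lidar range scan to 2-D Cartesian points, keeping only
    returns within max_range of the robot (hypothetical pre-filter; the
    patent does not specify an exact cutoff)."""
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r <= max_range:  # drop invalid and far returns
            a = angle_min + i * angle_increment
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

# Example: three beams, the middle one beyond the range cutoff
pts = scan_to_points([1.0, 5.0, 2.0], angle_min=-0.1, angle_increment=0.1)
```

The candidate leg points produced this way would then feed the preprocessing of step S2.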

[0061] Step S2, preprocessing: the robot preprocesses the leg laser data points of the followed target by filtering out laser points disturbed by the environment, then proceeds to step S3.

[0062] Specifically, step S2 includes the following steps:

[0063] Step S21, calculate the geometric relationship between the leg diameter and the distance to the followed target:

[0064]

[0065] In the formula, θ is the ...
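The patent's exact formula is truncated in this excerpt, so the sketch below uses the standard chord-angle relation as an assumed form: a leg of diameter d at distance R subtends an angle θ = 2·arcsin(d / 2R), which predicts how many lidar beams should land on a real leg. Both helper names are hypothetical.

```python
import math

def leg_angular_width(leg_diameter, distance):
    """Angular width (radians) subtended by a leg of the given diameter
    at the given distance, via the chord-angle relation (assumed form;
    the patent's own equation is not shown in this excerpt)."""
    return 2.0 * math.asin(leg_diameter / (2.0 * distance))

def expected_beam_count(leg_diameter, distance, angle_increment):
    """Number of lidar beams expected to hit the leg; clusters whose
    point count deviates strongly from this can be filtered as
    environmental interference (step S2's preprocessing idea)."""
    return int(leg_angular_width(leg_diameter, distance) / angle_increment)
```

For example, a 12 cm leg at 1.5 m seen by a scanner with a 0.005 rad beam spacing should produce roughly sixteen returns; clusters far from that count are unlikely to be legs.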



Abstract

An automatic following method for a mobile robot suitable for multi-person scenes, comprising the following steps. Step S1, start sampling: the robot obtains the laser data points of the followed target's legs, then proceeds to step S2. Step S2, preprocessing: the leg laser data points of the followed target are preprocessed, then the method proceeds to step S3. Step S3, single-person leg recognition: according to the curvature and total length of the leg laser data, the robot separates the points into leg clusters (leg_cluster) and interference; it then groups leg clusters into pedestrian clusters (human_cluster) and interference according to the distance between adjacent leg clusters. If a pedestrian cluster is obtained, proceed to step S4. Step S4, leg locking in multi-person scenes: the robot separates the pedestrian clusters into the followed target and interfering targets according to each pedestrian cluster and the robot's change in angle per unit time. If the followed target is obtained, proceed to step S5. Step S5, start following.
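The leg-clustering and leg-pairing stages of step S3 can be sketched as follows. This is a simplified illustration under assumed thresholds (`gap`, `max_leg_separation`); the patent's actual criteria also use curvature and total cluster length, which are omitted here.

```python
import math

def cluster_points(points, gap=0.1):
    """Group consecutive scan points into clusters, splitting wherever
    the gap between neighbours exceeds a threshold (threshold value is
    an assumption; the patent does not publish it in this excerpt)."""
    clusters, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if math.dist(p, q) <= gap:
            current.append(q)
        else:
            clusters.append(current)
            current = [q]
    clusters.append(current)
    return clusters

def pair_legs(leg_clusters, max_leg_separation=0.5):
    """Pair adjacent leg clusters into pedestrian clusters when their
    centroids are close enough to belong to one person (the distance
    test between adjacent leg_clusters described in step S3)."""
    def centroid(c):
        xs, ys = zip(*c)
        return (sum(xs) / len(c), sum(ys) / len(c))
    humans, i = [], 0
    while i + 1 < len(leg_clusters):
        a, b = centroid(leg_clusters[i]), centroid(leg_clusters[i + 1])
        if math.dist(a, b) <= max_leg_separation:
            humans.append((leg_clusters[i], leg_clusters[i + 1]))
            i += 2
        else:
            i += 1
    return humans
```

With two tight groups of points about 0.3 m apart, `cluster_points` yields two leg clusters and `pair_legs` merges them into one pedestrian cluster; the target-locking of step S4 would then track that cluster across scans.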

Description

Technical field [0001] The invention relates to an automatic following method for a mobile robot. Background technique [0002] In human-robot interaction scenarios such as airport luggage-carrying robots, pedestrian-tracking shopping carts, and museum guide robots, mobile robots and their actuators require high-precision motion control. At present there are many methods for a robot to follow a person's legs; the most widely used are automatic following based on a single camera and automatic following based on a single lidar. The vision-based leg-tracking method is affected by illumination and the camera's field of view, so its tracking accuracy is poor; lidar-based methods, such as the robot automatic following method based on the ROS robot operating system disclosed in Chinese patent document CN107272680A, cannot distinguish the followed target from interfering targets in a multi-person scene. Summary of the invention [0003] The purpose of the present invention is to provide an automatic following method fo...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patent (China)
IPC(8): G05D1/02, G05D1/12, G01S7/48
CPC: G05D1/0212, G05D1/0231, G05D1/0257, G05D1/12, G01S7/4802
Inventor: 戴厚德, 姚瀚晨, 朱利琦, 林名强, 刘鹏华, 赵四林, 陈兴, 陈鸿宇
Owner QUANZHOU INST OF EQUIP MFG