
A robot intelligent self-following method, device, medium, and electronic equipment

The invention relates to robot intelligence and robot technology, applied in cleaning equipment, machine parts, instruments, etc. It addresses the problems that existing cleaning robots are easily affected by day and night operating conditions, have poor robustness, and have low tracking accuracy, and it achieves strong adaptability and anti-interference capability, improves tracking accuracy, and meets the computing power requirements.

Active Publication Date: 2021-12-17
Applicant: 广东盈峰智能环卫科技有限公司 +1

AI Technical Summary

Problems solved by technology

[0003] One aspect of the present invention provides an intelligent self-following method to solve the technical problems of existing cleaning robots during self-following: poor environmental robustness, susceptibility to day and night operating conditions, low positioning accuracy, and low tracking accuracy.



Examples


Embodiment Construction

[0048] It should be noted that, provided no conflict arises, the embodiments of the present application and the features within those embodiments may be combined with one another. The present invention is described in detail below with reference to the accompanying drawings and examples.

[0049] Referring to figure 1, a preferred embodiment of the present invention provides a robot intelligent self-following method, comprising the steps of:

[0050] S2. Perform quantile processing on the i-th line of point cloud data obtained from a single-frame scan of the multi-line lidar, and take the average value of the point cloud data located between two adjacent quantile points as the quantile value Ω;

[0051] S3. Use DBSCAN to perform point cloud clustering on the i-th line of point cloud data, and obtain the smallest value λi in the i-th line of point cloud data, wherein the Eps and MinPts parameters of DBSCAN are set piecewise according to the quantile value Ω (a code sketch of steps S2 and S3 follows this step list);

[0052] ...
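The patent text does not publish the exact quantiles or the Ω-to-parameter mapping, so the following is a minimal Python sketch of steps S2 and S3 under stated assumptions: the i-th scan line arrives as an (N, 3) NumPy array of Cartesian points, "quantile processing" is read as averaging the ranges between two adjacent quartiles, λi is taken as the smallest range among points DBSCAN does not mark as noise, and eps_minpts_from_omega is a hypothetical stand-in for the unpublished piecewise rule that sets Eps and MinPts from Ω.

```python
import numpy as np
from sklearn.cluster import DBSCAN


def quantile_value(line_ranges):
    """Step S2 (sketch): Ω is the average of the range values lying between
    two adjacent quantile points. Using the 25% and 50% quantiles is an
    assumption; the patent does not say which adjacent quantiles are meant."""
    q_lo, q_hi = np.quantile(line_ranges, [0.25, 0.50])
    between = line_ranges[(line_ranges >= q_lo) & (line_ranges <= q_hi)]
    return float(between.mean()) if between.size else float(np.median(line_ranges))


def eps_minpts_from_omega(omega):
    """Hypothetical piecewise rule: the patent only states that Eps and MinPts
    are set in segments according to Ω; these breakpoints are illustrative."""
    if omega < 2.0:       # near target: dense returns, tight clusters
        return 0.10, 8
    if omega < 5.0:       # mid range
        return 0.25, 5
    return 0.50, 3        # far target: sparse returns, looser clusters


def lambda_for_line(line_points):
    """Steps S2-S3 (sketch) for one scan line. line_points is assumed to be an
    (N, 3) NumPy array of Cartesian points from one line of a single
    multi-line-lidar frame. λi is read here as the smallest range among the
    points DBSCAN keeps, i.e. those not labelled as noise (-1)."""
    ranges = np.linalg.norm(line_points, axis=1)
    omega = quantile_value(ranges)                       # S2: quantile value Ω
    eps, min_pts = eps_minpts_from_omega(omega)          # parameters from Ω
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(line_points)  # S3
    clustered = ranges[labels != -1]
    return float(clustered.min()) if clustered.size else float(ranges.min())
```

Running lambda_for_line once per scan line yields the n values λi that the later steps average into the frame-level detection result λ.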



Abstract

The invention discloses a robot intelligent self-following method, device, medium, and electronic equipment. The intelligent self-following method includes the steps of: performing quantile processing on the i-th line of point cloud data obtained from a single-frame scan of a multi-line lidar, and taking the average value of the point cloud data located between two adjacent quantile points as the quantile value Ω; using DBSCAN to perform point cloud clustering on the i-th line of point cloud data and obtaining the smallest value λi in that line's point cloud data; repeating the above steps to obtain the n values λi of the multi-line lidar, and taking the average value λ of the n values λi as the detection result of the single-frame scan; and calculating the Euclidean distance d between the value λ of the current single-frame scan and that of the previous historical single-frame scan to judge whether the target is lost. The invention is applicable both day and night and is little affected by the environment. During following, the tracking result for the operator is obtained with the multi-line lidar, so the detection accuracy is high, and combining it with the historical track yields high tracking accuracy.
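Below is a minimal sketch of the frame-level part of the abstract, in the same Python/NumPy style as the per-line sketch above, under two assumptions: λ is a scalar, so the Euclidean distance d reduces to an absolute difference, and the loss threshold (here LOSS_THRESHOLD) is illustrative rather than taken from the patent.

```python
import numpy as np

# Illustrative loss threshold in metres; the patent does not publish the criterion.
LOSS_THRESHOLD = 0.8


def frame_detection(per_line_lambdas):
    """Average the n per-line values λi of one multi-line-lidar frame into the
    frame-level detection result λ (the averaging step in the abstract)."""
    return float(np.mean(per_line_lambdas))


def target_lost(lambda_current, lambda_previous, threshold=LOSS_THRESHOLD):
    """Loss check (sketch): the Euclidean distance d between the current
    frame's λ and the previous (historical) frame's λ. With a scalar λ this
    reduces to an absolute difference; the threshold is an assumption."""
    d = float(np.linalg.norm(np.atleast_1d(lambda_current)
                             - np.atleast_1d(lambda_previous)))
    return d > threshold
```

In a following loop the robot would update the stored historical λ whenever target_lost returns False, and otherwise fall back on the historical track; how that track is maintained and fused with new detections is not detailed in the abstract.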

Description

Technical field

[0001] The present invention relates to the field of robot control, and in particular to a robot intelligent self-following method, device, medium, and electronic equipment.

Background technique

[0002] At present, cleaning robots are widely used for cleaning operations in communities, parks, pedestrian streets, and other public places because of their automatic obstacle avoidance and automatic following, which improve cleaning efficiency and reduce labor intensity to a certain extent. During operation, an existing cleaning robot generally uses a camera as the main sensor to identify a specific human body in front of it and perform visual tracking, or uses ultrasound as the main sensor, with transmitting and receiving separated, for ultrasonic positioning, detection, and tracking. However, visual recognition through a camera is easily affected by the external environment, such as lighting, and the RGB image information cannot be used at night. It is diff...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): A47L11/24, A47L11/40, G06K9/62, G06T7/246
CPC: A47L11/24, A47L11/40, A47L11/4011, G06T7/246, G06T2207/10028, G06F18/2321
Inventors: 张岁寒, 张斌, 陈凯, 李亮, 周诚远, 瞿静, 龚建球, 罗新亮, 晋亚超
Owner: 广东盈峰智能环卫科技有限公司