Symmetric environment repositioning method and device and robot

A robot relocation technology, applied in the field of sweeping robots, that can solve problems such as relocation errors and achieve the effect of local relocation in symmetric environments

Pending Publication Date: 2021-01-26
SHENZHEN TOPBAND
Cites: 0 · Cited by: 1

AI-Extracted Technical Summary

Problems solved by technology

[0005] The invention provides a relocation method for a symmetrical environment...

Method used

In this application, the point cloud information includes the environmental information of the robot's surroundings and the heading angle information of the robot; the environmental information is the environmental contour of the robot's surroundings, obtained by collecting the environment with the lidar point cloud. The heading angle information limits the first angle of the robot's relocation, and the target environment information corresponding to the first angle is then extracted from the environmental information, thereby limiting the environmental contour used for relocation. The target position of the robot in the world map can therefore be located according to the target environment information; this target position, which is the robot's real-time positioning position, is screened out as the single correct position through the heading angle and the robot's pose angle.

Abstract

The invention is suitable for the technical field of sweeping robots, and provides a symmetric environment repositioning method and device and a sweeping robot. The method comprises the steps of: obtaining point cloud information of the environment where the robot is located, the point cloud information comprising the environment information of the environment where the robot is located and the course angle information of the robot; determining a first angle for limiting repositioning of the robot according to the course angle information; extracting target environment information matched with the first angle from the environment information; and positioning a target position of the robot in a preset world map according to the target environment information, and taking the position information mapped by the target position as real-time positioning information of the robot. According to the embodiment of the invention, an IMU is used to assist repositioning so as to limit the repositioning environment contour of the robot, so that the correct position can be screened out according to the heading angle and the pose angle of the robot, wrong symmetric positions are eliminated, and local repositioning of the sweeping robot in a symmetric environment is realized.

Application Domain

Position/course control in two dimensions

Technology Topic

World map · Engineering +3


Examples

  • Experimental program (8)

Example Embodiment

[0047] Example One
[0048] In some alternative embodiments, see Figure 1, which is a schematic flowchart of a symmetric environment relocation method according to an embodiment of the present application.
[0049] As shown in Figure 1, this application provides a symmetric environment relocation method, including:
[0050] S1100. Acquire point cloud information of the environment where the robot is located, where the point cloud information includes environmental information of the environment where the robot is located and heading angle information of the robot;
[0051] The point cloud information can be environmental information collected by lidar. Lidar is a measurement device that integrates laser scanning with positioning and attitude determination systems. A lidar system includes a laser and a receiving system. Its working principle is that the laser generates and emits a light pulse, which hits an object, is reflected back, and is finally received by the receiver. The receiver accurately measures the propagation time of the light pulse from emission to reception; since the speed of light is known, the propagation time can be converted into a distance measurement. Combining the height of the laser and its scanning angle, the three-dimensional coordinates X, Y, Z of each ground spot can be accurately calculated. Point cloud data refers to a set of vectors in a three-dimensional coordinate system. These vectors are usually expressed as X, Y, Z coordinates and are mainly used to represent the outer surface shape of an object; point cloud data can also represent the RGB color, gray value, depth, and segmentation result of a point.
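As a concrete illustration of the two calculations just described (time of flight to distance, then distance plus scan angle to coordinates), here is a minimal 2D sketch; the function names and the 2D simplification are illustrative additions, not from the patent:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def pulse_to_range(round_trip_time_s: float) -> float:
    """Convert the measured round-trip pulse time into a range in metres.
    The pulse travels to the object and back, hence the factor 0.5."""
    return 0.5 * SPEED_OF_LIGHT * round_trip_time_s

def polar_to_point(range_m: float, scan_angle_rad: float):
    """Combine range and scan angle into Cartesian coordinates in the
    scanner frame (a 2D simplification of the X, Y, Z case in the text)."""
    return (range_m * math.cos(scan_angle_rad),
            range_m * math.sin(scan_angle_rad))

# Example: a pulse returning after ~33.4 ns corresponds to roughly 5 m.
print(round(pulse_to_range(33.4e-9), 2))  # -> 5.01
```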
[0052] The point cloud information includes the environmental information of the robot's surroundings and the robot's heading angle information. During implementation, the environmental contour collected by the lidar point cloud can be used as the environmental information, while the heading angle information is the sweeping robot's orientation angle on the map, measured by the IMU. The robot's orientation at power-on determines the zero point of the IMU heading angle; after the map is created, the angles between the IMU heading and the world-map axes (such as the X, Y, Z axes) are fixed. An IMU (Inertial Measurement Unit) is a device for measuring the three-axis attitude angles and acceleration of an object; an IMU generally includes a three-axis gyroscope and a three-axis accelerometer. That is, when the machine starts up, the robot's heading angle is calculated by the IMU algorithm, the sweeping robot scans the environmental information of its surroundings through the lidar during operation, and the environmental information and heading angle information are combined into the point cloud information.
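One plausible way to bundle this combined point cloud information in code; the class and field names are hypothetical, but the fields mirror the text (lidar contour points plus the IMU heading, whose zero point is fixed by the robot's orientation at power-on):

```python
from dataclasses import dataclass

@dataclass
class PointCloudInfo:
    """One lidar sweep together with the robot's IMU heading.

    points: (x, y) returns in the robot frame, i.e. the environmental
        contour observed by the lidar.
    heading_rad: IMU yaw angle; its zero is set by the orientation at
        power-on, so after mapping it keeps a fixed offset to the
        world-map axes.
    """
    points: list
    heading_rad: float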
[0053] S1200. Determine a first angle that limits the robot's relocation according to the heading angle information;
[0054] S1300. Extract target environment information adapted to the first angle from the environment information;
[0055] The first angle of the robot's relocation is determined by the heading angle. During implementation, the sweeping robot collects environmental contours through the lidar point cloud by scanning the surrounding environment in all directions. Limiting the first angle of relocation restricts either the angle at which the robot collects environmental information, or the portion of the environmental information used to determine the robot's position.
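A minimal sketch of this extraction step, assuming the first angle defines the centre of an angular window (here defaulting to a 90° sector) and that points are expressed in the robot frame; the names and the window convention are assumptions:

```python
import math

def extract_target_environment(points, first_angle_rad,
                               half_width_rad=math.pi / 4):
    """Keep only the lidar points whose bearing lies inside the sector
    centred on the first angle, i.e. the target environment information."""
    kept = []
    for x, y in points:
        bearing = math.atan2(y, x)
        # Wrap the angular difference into [-pi, pi) before comparing.
        diff = (bearing - first_angle_rad + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= half_width_rad:
            kept.append((x, y))
    return kept
```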
[0056] S1400. Locate the target position of the robot on the preset world map according to the target environment information, and use the position information mapped to the target position as the real-time positioning information of the robot.
[0057] The system limits the collected environment information to the target environment information according to the heading angle, and compares and matches the target environment information against the world map to locate the target position of the robot; the target position is the robot's real-time positioning position. In a symmetric environment, an omnidirectional lidar point cloud scan is prone to positioning errors. For example, in a rectangular room, suppose the sweeping robot is located at one of the four corners. When the robot is moved to another corner, if the distances between the robot and the walls on both sides still satisfy the positioning conditions, multiple overlapping candidate positions are generated: all four corners appear to be the position before the move, but only one of them is correct and the rest are wrong, so the robot may be incorrectly positioned at a symmetric position.
[0058] The heading angle information is used to limit the first angle of the robot's relocation, so that the robot only uses the environmental information within the limited angle range. Please refer to Figure 2, a schematic diagram of the robot's positions in a rectangular environment in the symmetric environment relocation method of this application. As shown in Figure 2, the rectangular environment includes position A, position B, and position C, where position A is at the lower left corner, position B is at the lower right corner, and position C is in the middle of the rectangular environment; the initial position of the robot is position A. When the robot is moved to position C, the characteristics of the lidar point cloud allow the robot to determine that it has moved to position C, re-plan the cleaning trajectory, and complete the cleaning task. However, when the robot is moved from position A to position B, if, for example, the distance between the robot and the walls on both sides is 2 cm at position A and also 2 cm at position B, the robot does not perform local relocation, cannot complete the current cleaning task, and produces a mapping error. This application restricts the first angle of the robot's relocation: for example, the robot only scans the environmental information in the 90° range between its left side and its lower side. When the robot is carried from position A to position B, its distance to the left wall changes, so the robot performs local relocation and filters out the single correct position.
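The trigger in this example reduces to watching one restricted measurement for a jump. A hedged sketch, with an illustrative threshold and function name:

```python
def needs_local_relocation(observed_left_wall_m: float,
                           expected_left_wall_m: float,
                           tolerance_m: float = 0.05) -> bool:
    """With the scan restricted to the left/lower 90-degree sector, a jump
    in the measured left-wall distance relative to the distance expected
    at the current map pose indicates the robot was carried away and
    should run local relocation."""
    return abs(observed_left_wall_m - expected_left_wall_m) > tolerance_m
```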
[0059] In some embodiments, see Figure 3, a schematic diagram of the robot's heading and position in the symmetric environment relocation method of this application. As shown in Figure 3, the robot is at position 1 before being moved. Taking an IMU heading of 45° at position 1 as an example, the pose of the sweeping robot on the world map is (X1, Y1, 45°). After the sweeping robot is moved to position 2 without changing its orientation, the IMU heading remains unchanged, and the pose of the robot on the map is (X2, Y2, phi2); according to the correspondence between the IMU heading and the pose angle, the current pose angle of the robot is phi2 = 45°. Meanwhile, according to the environmental information observed by the radar point cloud, position 3 in Figure 3 is also consistent with the point cloud matching result, with a map pose of (X3, Y3, phi3). However, the angle calculated there by the IMU algorithm would be phi3 = -170°, which does not match the IMU heading, so position 3 is eliminated; this effectively eliminates wrongly matched symmetric positions.
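The elimination described here amounts to an angle-consistency check between the IMU heading and each candidate pose angle. A minimal sketch, assuming scan matching yields candidate poses (x, y, phi) in degrees and using a hypothetical tolerance:

```python
def filter_by_heading(candidates, imu_heading_deg, tol_deg=10.0):
    """Discard point-cloud-matched candidate poses whose pose angle
    disagrees with the IMU heading, eliminating symmetric mirrors."""
    def ang_diff(a, b):
        # Smallest absolute difference between two angles in degrees.
        return abs((a - b + 180.0) % 360.0 - 180.0)
    return [(x, y, phi) for (x, y, phi) in candidates
            if ang_diff(phi, imu_heading_deg) <= tol_deg]

# Example from the text: with an IMU heading of 45 degrees, the candidate
# with phi = 45 survives while the symmetric one with phi = -170 is dropped.
poses = [(2.0, 1.0, 45.0), (5.0, 4.0, -170.0)]
print(filter_by_heading(poses, imu_heading_deg=45.0))  # [(2.0, 1.0, 45.0)]
```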
[0060] In this application, the point cloud information includes the environmental information of the robot's surroundings and the heading angle information of the robot; the environmental information is the environmental contour of the robot's surroundings obtained by collecting the environment with the lidar point cloud. The heading angle information limits the first angle of the robot's relocation, and the target environment information corresponding to the first angle is then extracted from the environmental information, thereby limiting the environmental contour used for relocation. The target position of the robot on the world map can therefore be located according to the target environment information; the target position is the robot's real-time positioning position, screened out as the single correct position through the heading angle and the robot's pose angle. The wrong symmetric positions are eliminated, realizing local relocation of the sweeping robot in a symmetric environment.

Example Embodiment

[0061] Example Two
[0062] In some alternative embodiments, see Figure 4, which is a schematic diagram of the map-loading process in an embodiment of the symmetric environment relocation method of the present application.
[0063] As shown in Figure 4, before the step of obtaining point cloud information of the environment where the robot is located, the method also includes:
[0064] S1010. Obtain storage address information of a world map, where the world map includes a target area composed of multiple cleaning areas;
[0065] During the robot's working process, the cleaning environment is scanned and the cleaning trajectory is planned to construct a world map, and the cleaning environment is divided into multiple cleaning areas. For example, a home cleaning environment is divided into multiple cleaning areas (e.g., grid areas) on the home cleaning map, and multiple cleaning areas form a target area. Taking a family environment including a living room and a bedroom as an example, the living room and the bedroom are each a target area; the living room includes a sofa cleaning area, a carpet cleaning area, and a dining table cleaning area, while the bedroom includes an under-bed cleaning area and a dressing table cleaning area.
[0066] S1020. Read the world map according to the storage address information, and load the target area.
[0067] The world map is stored in a local database or on a cloud server. The world map can be obtained through its storage address and loaded so as to load the target area. When locating the target position of the robot on the world map, the target environment information is compared and matched with each cleaning area, or with the environmental contour of each cleaning area, so as to accurately locate the robot's position.
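A minimal loading sketch, assuming the world map is serialized as JSON at a local storage address (the text also allows a cloud server); the schema and key names are assumptions:

```python
import json
from pathlib import Path

def load_target_area(storage_address: str):
    """Read the serialized world map from its storage address and return
    the target area, i.e. the list of cleaning areas it is composed of."""
    with Path(storage_address).open("r", encoding="utf-8") as f:
        world_map = json.load(f)
    return world_map["target_area"]["cleaning_areas"]  # assumed schema
```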

Example Embodiment

[0068] Example Three
[0069] In some alternative embodiments, see Figure 5, which is a schematic diagram of the process of locating a target position in an embodiment of the symmetric environment relocation method of the present application.
[0070] As shown in Figure 5, the step of locating the target position of the robot on the preset world map according to the target environment information includes:
[0071] S1410. Extract any cleaning area in the target area as the target location, and compare the target location with the environment represented by the target environment information;
[0072] S1420. When the target location does not match the environment represented by the target environment information, extract the next cleaning area as the target location, until the target location matches the environment represented by the target environment information.
[0073] The target area includes multiple cleaning areas. The system compares the cleaning areas in the target area with the target environment information to find a target location that matches the target environment information. Taking Figure 2 as an example, the rectangular environment corresponds to the target area, and position A, position B, and position C correspond to cleaning areas (during implementation the rectangular environment also includes other cleaning areas; only positions A, B, and C are taken as examples here). When the robot is carried from position A to position B, the robot collects the target environment information of position B through the lidar point cloud, and then compares and matches the cleaning areas corresponding to positions A, B, and C on the world map with the environment of position B, thereby positioning the robot at position B, eliminating the wrong symmetric position A, and improving the accuracy of local relocation.
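A minimal sketch of steps S1410/S1420, assuming a caller-supplied matching predicate (for example, a scan-matching score compared against a threshold); all names are illustrative:

```python
def locate_target_position(cleaning_areas, target_env, matches):
    """Try each cleaning area in turn as the candidate target location
    until one matches the target environment information (S1410/S1420)."""
    for area in cleaning_areas:
        if matches(area, target_env):
            return area  # this cleaning area is the robot's target position
    return None  # no area matched; relocation failed
```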

