
Semantic map construction method based on depth camera and sweeping robot

A sweeping robot and semantic map technology, applied in the field of sweeping robots, which can solve problems such as limited environmental space information, limited working space and sparse map information, and achieve the effects of expanding the effective working space and improving mapping accuracy.

Pending Publication Date: 2020-09-18
BEIJING QIHOO TECH CO LTD

AI Technical Summary

Problems solved by technology

However, in the existing SLAM mapping method based only on lidar, the lidar can only detect obstacle information in a 2D plane and cannot detect information about obstacles in the vertical direction. The constructed map is therefore two-dimensional, the environmental space information it provides is limited, and some special obstacles (such as tables and chairs with hollow structures) cannot be effectively detected and handled by the lidar. In addition, because the lidar must be mounted at a certain height to work effectively, the sweeping robot cannot be made ultra-thin and thus cannot enter spaces with a small vertical clearance to perform its work.
Therefore, the existing SLAM mapping method based only on lidar produces maps that contain little information and have low accuracy, and it also prevents the sweeping robot from being ultra-thin and limits its working space.

Method used


Image

  • Figure: Semantic map construction method based on depth camera and sweeping robot

Examples


Detailed Description of the Embodiments

[0066] Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the drawings, wherein the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary and are intended only to explain the present application; they should not be construed as limiting the present application.

[0067] Those skilled in the art will understand that the singular forms "a", "an" and "the" used herein may also include the plural forms unless otherwise stated. It should be further understood that the word "comprising" used in the description of the present application refers to the presence of features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It will be und...



Abstract

The invention provides a semantic map construction method based on a depth camera, and a sweeping robot, applied to the technical field of robots. The method comprises the following steps: obtaining images at multiple positions of the sweeping robot in its environment space through the depth camera, wherein each image comprises an RGB image and a depth image; constructing a three-dimensional map of the environment space through a simultaneous localization and mapping (SLAM) algorithm based on the depth images; carrying out semantic recognition on each RGB image through a pre-trained neural network recognition model to obtain semantic information of each obstacle in the environment space; and fusing the three-dimensional map with the obtained semantic information of each obstacle to obtain a three-dimensional semantic map. Compared with a two-dimensional map of the environment space constructed from laser radar, the three-dimensional semantic map constructed by the application contains richer information and has higher accuracy, and the effective working space of the sweeping robot is expanded.
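For a concrete picture of the pipeline summarized above, the sketch below is a minimal, hypothetical illustration, not the patented implementation: each depth image is back-projected to a 3D point cloud with the pinhole camera model, transformed into the world frame using the camera pose estimated by SLAM, and tagged with per-pixel semantic labels predicted from the corresponding RGB image. The intrinsics K, the segment_rgb callable and all function names are illustrative assumptions.

```python
# Minimal, hypothetical sketch of the described pipeline (not the patented
# implementation). Assumes a pinhole depth camera with intrinsics K, a SLAM
# back end supplying a 4x4 camera-to-world pose per frame, and a pre-trained
# segmentation network wrapped by `segment_rgb`.
import numpy as np

def backproject(depth, K):
    """Convert a depth image (H, W) in metres into an (N, 3) point cloud in
    the camera frame, dropping pixels with no depth reading."""
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.reshape(-1)
    valid = z > 0
    x = (u.reshape(-1) - cx) * z / fx
    y = (v.reshape(-1) - cy) * z / fy
    return np.stack([x, y, z], axis=1)[valid], valid

def fuse_frame(semantic_map, rgb, depth, pose, K, segment_rgb):
    """Add one RGB-D frame to the accumulated 3D semantic map.

    semantic_map : list of (xyz_world, label) pairs built so far
    pose         : 4x4 camera-to-world transform estimated by SLAM
    segment_rgb  : callable returning an (H, W) array of per-pixel labels
    """
    labels = segment_rgb(rgb)                       # semantic recognition on the RGB image
    pts_cam, valid = backproject(depth, K)          # 3D structure from the depth image
    pts_h = np.hstack([pts_cam, np.ones((len(pts_cam), 1))])
    pts_world = (pose @ pts_h.T).T[:, :3]           # camera frame -> world frame
    semantic_map.extend(zip(pts_world, labels.reshape(-1)[valid]))
    return semantic_map

# Toy usage with synthetic data (real input would come from the depth camera
# and the SLAM/recognition modules):
K = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
depth = np.full((480, 640), 2.0)                    # a flat surface 2 m away
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
pose = np.eye(4)                                    # first frame defines the world frame
smap = fuse_frame([], rgb, depth, pose, K, lambda img: np.zeros(img.shape[:2], dtype=int))
```

A practical system would fuse the points into a voxel grid or octree and aggregate labels across frames (for example by majority vote) rather than keeping every labelled point, but the geometric and semantic fusion step is the same.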

Description

Technical field

[0001] The present application relates to the field of robot technology, and in particular to a depth-camera-based semantic map construction method and a sweeping robot.

Background technique

[0002] As a smart appliance that can automatically clean a target area, the sweeping robot can clean floors in place of people, reducing the burden of housework, and is increasingly widely accepted. Constructing a map of the environment space in which the sweeping robot operates is the basis for its cleaning work, so how to construct such a map has become a key issue.

[0003] The problem to be solved by Simultaneous Localization and Mapping (SLAM) technology is: if a robot is placed at an unknown position in an unknown environment, is there a way for the robot to gradually draw a map that is completely consistent with the environment...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D1/02
CPC: G05D1/0251, G05D1/0257, G05D1/0221, G05D1/0276
Inventor: 潘俊威, 魏楠哲, 王继鑫, 张磊
Owner: BEIJING QIHOO TECH CO LTD