Robot vision semantic navigation method, device and system

A robot vision semantic navigation technology, applied in the field of robot vision semantic navigation methods, devices, systems and computer storage media, which can solve problems such as the inability to navigate to objects outside the robot's field of view

Active Publication Date: 2020-09-11
WUHAN UNIV OF TECH


Problems solved by technology

[0004] The purpose of the present invention is to overcome the above-mentioned technical deficiencies and provide a robot visual semantic navigation method, device, system and computer storage medium.



Examples


Embodiment 1

[0025] As shown in Figure 1, Embodiment 1 of the present invention provides a robot visual semantic navigation method, comprising the following steps:

[0026] S1. Collect the scene images captured by the robot and, at the same time, the voice commands received by the robot, and establish a scene image set and a voice command set;
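Purely as an illustration of step S1, the sketch below pairs each scene image with the voice command received at the same moment; the `NavigationDataset` class, the image size (224x224 RGB) and the audio format (1 s at 16 kHz) are assumptions made for the example, not details given in the patent.

```python
from dataclasses import dataclass, field
from typing import List

import numpy as np

@dataclass
class NavigationDataset:
    """Holds the scene image set and the voice command set side by side."""
    scene_images: List[np.ndarray] = field(default_factory=list)
    voice_commands: List[np.ndarray] = field(default_factory=list)

    def collect_pair(self, image: np.ndarray, command: np.ndarray) -> None:
        """Store a scene image and the voice command received at the same time."""
        self.scene_images.append(image)
        self.voice_commands.append(command)

# One simulated timestep: a blank camera frame and one second of silent audio.
dataset = NavigationDataset()
dataset.collect_pair(
    image=np.zeros((224, 224, 3), dtype=np.uint8),  # RGB frame from the camera
    command=np.zeros(16000, dtype=np.float32),      # 1 s of audio at 16 kHz
)
```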

[0027] S2. Mark the image features of each scene image in the scene image set, and mark the voice features of each voice command in the voice command set;
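The patent does not name concrete feature extractors for step S2, so the following sketch stands in with two toy convolutional encoders (PyTorch); the architectures and the 512-d image / 128-d voice feature sizes are assumptions carried through the later examples.

```python
import torch
import torch.nn as nn

class ImageEncoder(nn.Module):
    """Toy CNN: maps an RGB frame to a 512-d image feature."""
    def __init__(self, feat_dim: int = 512):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, 3, H, W)
        return self.proj(self.conv(x).flatten(1))

class VoiceEncoder(nn.Module):
    """Toy 1-D CNN: maps a raw waveform to a 128-d voice feature."""
    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=80, stride=16), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, w: torch.Tensor) -> torch.Tensor:  # w: (B, 1, T)
        return self.proj(self.conv(w).flatten(1))

img_feat = ImageEncoder()(torch.zeros(1, 3, 224, 224))  # -> (1, 512)
voice_feat = VoiceEncoder()(torch.zeros(1, 1, 16000))   # -> (1, 128)
```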

[0028] S3. Combine the image features and voice features from the same moment to construct semantic maps, obtain a semantic map set, and mark the semantic features of each semantic map in the semantic map set;
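As one minimal reading of step S3, the sketch below treats a semantic map as a tiny graph linking the scene to the instruction heard with it, and summarizes the map as a single 128-d semantic feature; both the graph layout and the mean-pooling summary are assumptions, not the patent's construction.

```python
import numpy as np

def build_semantic_map(img_feat: np.ndarray, voice_feat: np.ndarray):
    """Combine same-moment image and voice features into a two-node graph."""
    nodes = {"scene": img_feat, "instruction": voice_feat}
    edges = [("scene", "instruction")]  # same-timestep association
    return nodes, edges

def semantic_feature(nodes: dict, dim: int = 128) -> np.ndarray:
    """Mark the map with one semantic feature: mean of node features resized to `dim`."""
    stacked = [np.resize(v, dim) for v in nodes.values()]
    return np.mean(stacked, axis=0)
```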

[0029] S4. Fuse the image features, voice features and semantic features from the same moment to construct state vectors, obtaining a state vector set;
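Step S4 only says the three same-moment features are fused; the simplest fusion consistent with that wording is concatenation, sketched here with the feature sizes assumed above (512 + 128 + 128 = 768).

```python
import numpy as np

def fuse_state(img_feat: np.ndarray, voice_feat: np.ndarray, sem_feat: np.ndarray) -> np.ndarray:
    """Fuse same-moment image, voice and semantic features into one state vector."""
    return np.concatenate([img_feat, voice_feat, sem_feat]).astype(np.float32)

state = fuse_state(np.zeros(512), np.zeros(128), np.zeros(128))
assert state.shape == (768,)  # 512 + 128 + 128 with the assumed dimensions
```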

[0030] S5. Mark the action sequence corresponding to each state vector in the state vector set, and use the state vector set as training samples to train a deep reinforcement learning model, obtaining a navigation model; the robot is then subjected to navigation control according to the navigation model.
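Because step S5 labels each state vector with an action sequence, the labelled set can be used to fit a policy network directly, as in the supervised sketch below; the discrete action set, network sizes and training loop are assumptions, and the patent's actual deep reinforcement learning procedure (rewards, exploration and so on) is not specified here.

```python
import torch
import torch.nn as nn

ACTIONS = ["forward", "turn_left", "turn_right", "stop"]  # assumed discrete action set

# Policy network: 768-d fused state vector -> scores over the actions.
policy = nn.Sequential(nn.Linear(768, 256), nn.ReLU(), nn.Linear(256, len(ACTIONS)))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy labelled batch: 32 state vectors with their marked actions.
states = torch.randn(32, 768)
actions = torch.randint(len(ACTIONS), (32,))

for _ in range(100):  # a few gradient steps on the labelled set
    optimizer.zero_grad()
    loss = loss_fn(policy(states), actions)
    loss.backward()
    optimizer.step()

# The trained `policy` plays the role of the navigation model: at run time the
# robot would take argmax(policy(state)) as its next navigation action.
```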

Embodiment 2

[0071] Embodiment 2 of the present invention provides a robot visual semantic navigation device, including a processor and a memory on which a computer program is stored. When the computer program is executed by the processor, the robot visual semantic navigation method provided by Embodiment 1 is realized.

[0072] The robot visual semantic navigation device provided by this embodiment of the present invention is used to implement the robot visual semantic navigation method; therefore, the device also possesses the technical effects of that method, which are not repeated here.

Embodiment 3

[0074] As shown in Figure 2, Embodiment 3 of the present invention provides a robot visual semantic navigation system, including the robot visual semantic navigation device 1 provided in Embodiment 2 and a robot 2;

[0075] The robot 2 comprises a visual collection module, a voice collection module, a communication module and a movement control module;

[0076] The visual collection module is used to collect scene images;

[0077] The voice collection module is used to collect voice commands;

[0078] The communication module is used to send the scene images and voice commands to the robot visual semantic navigation device 1, and to receive the navigation control instructions sent by the robot visual semantic navigation device 1;

[0079] The movement control module is used to perform navigation control of the robot's joints according to the navigation control instructions.

[0080] In this embodiment, the robot visual semantic navigation device 1 can be ...
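The control loop of Embodiment 3 can be sketched as below: the robot's collection modules capture an image and a voice command, the communication module exchanges them with the navigation device, and the movement control module executes the returned instruction. The transport (plain function calls here) and all names are illustrative assumptions.

```python
class NavigationDevice:
    """Stands in for device 1; `plan` is a placeholder for the Embodiment 1 pipeline."""
    def plan(self, image, command) -> str:
        # features -> semantic map -> state vector -> navigation model -> instruction
        return "forward"

class Robot:
    """Stands in for robot 2, with its four modules folded into methods."""
    def __init__(self, device: NavigationDevice):
        self.device = device

    def step(self) -> None:
        image = self.capture_image()                    # visual collection module
        command = self.capture_voice()                  # voice collection module
        instruction = self.device.plan(image, command)  # communication module
        self.move(instruction)                          # movement control module

    def capture_image(self): return None  # camera stub
    def capture_voice(self): return None  # microphone stub
    def move(self, instruction: str) -> None:
        print(f"executing: {instruction}")

Robot(NavigationDevice()).step()  # prints "executing: forward"
```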



Abstract

The invention relates to the technical field of robot navigation, and discloses a robot vision semantic navigation method. The robot vision semantic navigation method comprises the following steps: a scene image set and a voice instruction set are established; image features of each scene image in the scene image set are marked, and voice features of each voice instruction in the voice instruction set are marked; semantic maps are constructed by combining the image features and the voice features from the same moment to obtain a semantic map set, and semantic features of each semantic map in the semantic map set are marked; state vectors are constructed by fusing the image features, the voice features and the semantic features from the same moment to obtain a state vector set; action sequences of each state vector in the state vector set are marked, a deep reinforcement learning model is trained with the state vector set as training samples, and a navigation model is obtained; and the robot is subjected to navigation control according to the navigation model. According to the robot vision semantic navigation method, navigation to objects outside the robot's field of view can be achieved.

Description

technical field

[0001] The invention relates to the technical field of robot navigation, in particular to a robot visual semantic navigation method, device, system and computer storage medium.

Background technique

[0002] Semantic, goal-oriented navigation is a challenging task, and in everyday life visual navigation involves multiple problems. First, the robot may not know anything about the environment, in which case it needs to explore the environment to gain a better understanding of it. Second, the target object may not be visible when the robot starts to navigate, or may move out of view during navigation; therefore, robots need to learn effective search strategies to find target objects. Finally, even when the object is visible, planning a reasonable path to it is another problem the robot needs to deal with.

[0003] The previous navigation method was map-based navigation, SLAM (Simultaneous Localization and Mapping, real-time localization and mapping)...


Application Information

IPC(8): B25J9/16
CPC: B25J9/16; B25J9/1697; B25J9/1664
Inventor: 宋华珠 (Song Huazhu), 金宇 (Jin Yu)
Owner WUHAN UNIV OF TECH