
Dynamic learning method and system for robot, robot and cloud server

A dynamic learning and robot technology, applied in the field of robot interaction, which addresses the problems that robots cannot learn dynamically from people and the environment and cannot establish deep artificial intelligence, achieving the effect of an improved user experience

Active Publication Date: 2018-06-29
CLOUDMINDS SHANGHAI ROBOTICS CO LTD

AI Technical Summary

Problems solved by technology

[0007] Although the robots disclosed in the above patents can coordinate their actions with voice, they cannot learn dynamically from people and the environment, and cannot establish deep artificial intelligence.



Examples


Embodiment 1

[0052] Please refer to figure 2. This embodiment relates to a dynamic learning robot. In this embodiment, the rule base and annotation base are set on the robot side. The dynamic learning robot includes a training learning module 20, a working module 30, a task execution module 40, a sending module 14 and an acquisition module 12.

[0053] The training and learning module 20 includes a dynamic labeling module 22, a rule updating module 26 and a labeling updating module 24.

[0054] The dynamic labeling module 22 dynamically labels the attribution-use relationships between objects and people in the three-dimensional environment and stores them in the label library.
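As a concrete illustration of what such an annotation might look like, here is a minimal sketch in Python; the class, field and library names are hypothetical, since the patent does not prescribe a storage schema.

```python
from dataclasses import dataclass, field


@dataclass
class AttributionUseAnnotation:
    """One dynamic annotation: who owns or uses an object seen in the 3D
    environment. All names here are illustrative, not from the patent."""
    object_id: str          # e.g. "cup_01", identified from the point cloud
    person_id: str          # e.g. "alice", identified via face recognition
    relation: str           # "belongs_to" (attribution) or "used_by" (use)
    attributes: dict = field(default_factory=dict)  # color, material, shape, ...


class LabelLibrary:
    """A minimal in-memory label library the dynamic labeling module appends to."""

    def __init__(self):
        self._annotations: list[AttributionUseAnnotation] = []

    def add(self, annotation: AttributionUseAnnotation) -> None:
        self._annotations.append(annotation)

    def query(self, object_id: str) -> list[AttributionUseAnnotation]:
        return [a for a in self._annotations if a.object_id == object_id]


# Example: label a red cup as belonging to Alice.
library = LabelLibrary()
library.add(AttributionUseAnnotation(
    object_id="cup_01",
    person_id="alice",
    relation="belongs_to",
    attributes={"color": "red", "material": "ceramic"},
))
```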

[0055] The dynamic labeling module 22 extracts the point cloud features of the three-dimensional environment through machine vision, and obtains the object's appearance attributes (including but not limited to color and material) and geometric model (object shape) by identifying the point cloud and visual images, ...
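A hedged sketch of this attribute-extraction step, using plain NumPy on an already-segmented object point cloud. The function name, and the choice of a bounding box plus mean color as stand-ins for "geometric model" and "appearance attributes", are assumptions for illustration, not the patent's method.

```python
import numpy as np


def extract_object_attributes(points: np.ndarray, colors: np.ndarray) -> dict:
    """Derive coarse appearance and shape attributes for one segmented object.

    points: (N, 3) array of x, y, z coordinates of the object's point cloud.
    colors: (N, 3) array of RGB values in [0, 1] for the same points.
    Returns a dictionary of attributes in the spirit of paragraph [0055];
    a real system would use a full 3D perception stack instead.
    """
    # Rough geometric model: axis-aligned bounding box and its extents.
    lo, hi = points.min(axis=0), points.max(axis=0)
    extents = hi - lo

    # Appearance attribute: mean color as a crude proxy for dominant color.
    mean_rgb = colors.mean(axis=0)

    return {
        "bbox_min": lo.tolist(),
        "bbox_max": hi.tolist(),
        "extents": extents.tolist(),      # width, depth, height of the object
        "mean_color": mean_rgb.tolist(),  # rough color/material cue
    }


# Example with a synthetic 100-point cloud.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 0.1, size=(100, 3))   # a roughly 10 cm object
cols = np.tile([0.8, 0.1, 0.1], (100, 1))    # mostly red points
print(extract_object_attributes(pts, cols))
```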

Embodiment 2

[0078] Please refer to figure 3. This embodiment also relates to a cloud server. In this embodiment, the rule base and annotation base are set on the cloud server to reduce the amount of data processing on the robot side and to establish a unified artificial intelligence processing framework.

[0079] The cloud server includes a receiving module 102, a sending module 104, a storage module 120, a rule library 122, an annotation library 124, a confirmation module 140 and an optimization module 130.

[0080] The storage module 120 stores the annotations with which the robot dynamically annotates the attribution-use relationships of objects and people in the three-dimensional environment, forming an annotation library 124. The storage module also stores the robot's rule library 122.

[0081] The receiving module 102 receives new rules established by the robot through interactive demonstration behaviors based on the rule base and annotation base. The receiving module 102 also receiv...
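The patent does not spell out how the confirmation module 140 decides that a new rule conflicts with the rule base. One plausible reading, sketched below with invented names and types, treats a rule as a condition-action pair and flags a conflict when the same condition would map to a different action.

```python
# A minimal sketch of the cloud-side confirm-then-update flow described in
# Embodiment 2. The rule representation (condition -> action) and the notion
# of "conflict" (same condition, different action) are assumptions; the
# patent text does not define them.

RuleBase = dict[str, str]  # condition -> action


def conflicts_with(rule_base: RuleBase, condition: str, action: str) -> bool:
    """A new rule conflicts if its condition already maps to a different action."""
    return condition in rule_base and rule_base[condition] != action


def confirm_and_update(rule_base: RuleBase, annotation_library: list,
                       new_rule: tuple[str, str], new_annotation: dict) -> bool:
    """Confirmation module: accept the new rule and annotation only if the
    rule does not conflict with the existing rule base."""
    condition, action = new_rule
    if conflicts_with(rule_base, condition, action):
        return False  # reject; the robot may ask the user to re-demonstrate
    rule_base[condition] = action
    annotation_library.append(new_annotation)
    return True


# Example: teach the server that Alice's cup goes on the top shelf.
rules: RuleBase = {"dirty_dish_seen": "load_dishwasher"}
annotations: list = []
ok = confirm_and_update(rules, annotations,
                        ("alice_cup_seen", "place_top_shelf"),
                        {"object": "cup_01", "person": "alice",
                         "relation": "belongs_to"})
assert ok and rules["alice_cup_seen"] == "place_top_shelf"
```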

Embodiment 3

[0086] Please refer to figure 2. This embodiment relates to a dynamic learning system for a robot. The dynamic learning system can be set up entirely on the robot side; for details, please refer to Embodiment 1. The dynamic learning system can also divide the work between the robot 10 and the cloud server 100; when the cloud server 100 handles the establishment and updating of the annotation library and the rule library, please refer to Embodiment 2 for the specific technical content.

[0087] In general, the robot dynamic learning system mainly includes a training learning module, a working module, a task execution module and an inquiry interaction module.

[0088] The training and learning module is used to perform the following steps: dynamically label the attribution-use relationships of objects and people in the three-dimensional environment to form a label library; obtain the rule library, and establish new rules and new labels based on the rule library and label ...
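To make the robot/cloud division of work concrete, the following sketch (all class, method and message names invented) shows the robot side packaging a rule and annotation learned from an interactive demonstration and deferring the conflict check and storage to the cloud server of Embodiment 2.

```python
import json


class CloudClient:
    """Stand-in for the robot's sending/acquisition modules talking to the
    cloud server of Embodiment 2. A real system would use RPC or HTTP; here
    the "server" state is held locally so the sketch is self-contained."""

    def __init__(self, rule_base: dict, annotation_library: list):
        self._rules = rule_base
        self._annotations = annotation_library

    def submit(self, payload: str) -> bool:
        """Send a new rule plus annotation; the server confirms or rejects."""
        msg = json.loads(payload)
        condition, action = msg["rule"]["condition"], msg["rule"]["action"]
        if condition in self._rules and self._rules[condition] != action:
            return False  # conflict: the robot should query the user
        self._rules[condition] = action
        self._annotations.append(msg["annotation"])
        return True


def learn_from_demonstration(client: CloudClient, demo: dict) -> bool:
    """Robot side: turn an interactive demonstration into a rule + annotation
    and defer the conflict check and storage to the cloud."""
    payload = json.dumps({
        "rule": {"condition": f"{demo['object']}_seen",
                 "action": demo["desired_action"]},
        "annotation": {"object": demo["object"], "person": demo["person"],
                       "relation": demo["relation"]},
    })
    return client.submit(payload)


client = CloudClient(rule_base={}, annotation_library=[])
demo = {"object": "cup_01", "person": "alice",
        "relation": "belongs_to", "desired_action": "place_top_shelf"}
assert learn_from_demonstration(client, demo)
```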



Abstract

A dynamic learning method for a robot comprises a training learning mode that includes the following steps: dynamically annotating the attribution-use relationships between objects and people in a three-dimensional environment to form an annotation library; acquiring a rule base, and establishing new rules and new annotations through interactive demonstration behaviors based on the rule base and the annotation library; and, when it is confirmed that an established new rule has no conflict with the rules in the rule base, updating the new rule into the rule base and the new annotation into the annotation library.

Description

Technical field

[0001] The invention relates to the field of robot interaction, in particular to a dynamic learning method, system, robot and cloud server for a robot.

Background technique

[0002] With the development of network transmission and big data technology and the improvement of hardware processing capabilities, more and more robots have entered people's family life.

[0003] Existing robots focus functionally on artificial intelligence, such as face recognition, object detection, intelligent voice response and text recognition. In 2015, the speech recognition accuracy of research and development institutions such as Baidu, Google and SoundHound exceeded 90%. In the 4th International Multi-Channel Speech Separation and Recognition Competition in 2016, the recognition error rate of most participating teams was below 7% under the six-microphone condition; the domestic leader iFLYTEK even dropped to 2.24%, and the accuracy rate is approach...


Application Information

IPC(8): B25J11/00
CPC: B25J11/0005; G06N5/025; G06N5/022; G06N20/00; G06N3/008; G05B19/4155; G05B2219/39299
Inventor: 张站朝
Owner: CLOUDMINDS SHANGHAI ROBOTICS CO LTD