
Robot realization method, control method, robot and electronic device

The application, which relates to implementation methods and robot technology in the field of robotics, solves the problems of excessive control instructions and the high demands that remote control places on the user and on the video stream, achieving a low control frequency, a friendly interactive experience, and simple motion control.

Inactive Publication Date: 2017-12-26
BEIJING DEEPGLINT INFORMATION TECH

AI Technical Summary

Problems solved by technology

[0005] The embodiments of the present application propose a robot implementation method, a control method, a robot, and electronic equipment to solve technical problems in the prior art of remote control through motion direction and speed commands, including stringent video-stream latency requirements, high demands on the user, and a large number of control instructions.



Examples


Embodiment 1

[0039] Figure 1 shows a schematic flowchart of the robot implementation method in the embodiment of the present application. As shown in the figure, the robot implementation method may include the following steps:

[0040] Step 101, sending the acquired real-scene image and its synchronization information;

[0041] Step 102, receiving the target position determined by the user according to the real-scene image, together with the synchronization information;

[0042] Step 103, determining the target position of the robot according to the target position and the synchronization information;

[0043] Step 104, generating a planned path according to the obstacle information of the current scene and the target position of the robot;

[0044] Step 105, moving to the target position according to the planned path.

[0045] Wherein, the synchronization information may include odometer information, and the odometer information may include information such as starting from the ...
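The planning of Step 104 can be sketched as a search over an occupancy grid built from the sensed obstacle information. This is a minimal illustration only: the grid layout, the function name `plan_path`, and the use of breadth-first search are all assumptions, standing in for whatever planner the embodiment actually employs.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Step 104 sketch: breadth-first search on an occupancy grid
    (0 = free cell, 1 = obstacle). Returns the shortest list of cells
    from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}          # cell -> predecessor on the path
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:               # reconstruct the path backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in came_from:
                came_from[nxt] = cell
                queue.append(nxt)
    return None                        # no route around the obstacles

# The obstacle row forces a detour; Step 105 would then follow the
# returned cells in order toward the target position.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
```

A real planner would operate on metric coordinates and robot footprint rather than unit cells, but the structure (obstacle map in, ordered waypoints out) matches the step sequence above.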

Embodiment 2

[0097] The embodiment of the present application also provides a robot control method, which is described as follows from the perspective of the control terminal.

[0098] Figure 2 shows a schematic flowchart of the robot control method in the embodiment of the present application. As shown in the figure, the robot control method may include the following steps:

[0099] Step 201, receiving the real-scene image and synchronization information sent by the robot;

[0100] Step 202, determining the target position selected by the user on the real-scene image;

[0101] Step 203, sending the determined target position and the synchronization information.

[0102] In the embodiment of the present application, the control terminal only needs to receive the real-scene image sent by the robot and select the target position by clicking on it to remotely direct the robot to move to the designated position, without the need for th...
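Steps 201 to 203 amount to pairing the user's click with the synchronization information that arrived with the displayed frame. The sketch below illustrates that pairing; the dictionary layout and every field name are assumptions for illustration, not the patent's actual message format.

```python
def make_target_message(click_xy, sync_info):
    """Step 203 sketch: send the chosen pixel together with the sync info
    of the frame it was chosen on, so the robot can later compensate for
    any motion that happened since that frame was captured."""
    return {"target_px": tuple(click_xy), "sync": dict(sync_info)}

# Step 201: a frame arrives with its sync info (hypothetical fields).
frame_sync = {"frame_id": 17, "pose": (0.0, 0.0, 0.0)}

# Step 202: the user clicks pixel (320, 400) on the displayed image.
msg = make_target_message((320, 400), frame_sync)
```

Echoing the sync info back unchanged is what lets the robot, not the control terminal, resolve the click against the pose it had at capture time.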

Embodiment 3

[0122] Assume that the robot takes a real-scene image every 10 s while traveling along the previously planned path.

[0123] Figure 6 shows a schematic diagram of the usage scenario in the embodiment of the present application. As shown in the figure, the robot in three-dimensional real space interacts with the control terminal through the cloud, and the user of the control terminal views the displayed two-dimensional image and specifies the target point.

[0124] The robot uses its camera to capture a real-scene image (or video) of the current scene; assume the first frame is taken by the robot at 00:50, and the real-scene image can be an RGB two-dimensional image. The robot obtains its current attitude information from the odometer, for example position (0 m, 0 m, 0 m), facing 0 rad, and uses an obstacle-sensing module (for example: ultrasonic, radar, infrared, or a depth camera) to perceive obstacle information, including obsta...
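The odometer poses illustrate why the synchronization information matters: the robot may have moved between capturing the frame and receiving the user's choice. Below is a minimal 2D sketch of that compensation, with the function names assumed and the pixel-to-ground projection omitted.

```python
import math

def to_world(pose, point):
    """A point given in the robot frame at `pose` (x, y, heading in rad),
    expressed in world coordinates."""
    x, y, th = pose
    px, py = point
    return (x + px * math.cos(th) - py * math.sin(th),
            y + px * math.sin(th) + py * math.cos(th))

def to_robot(pose, world_point):
    """Inverse of to_world: a world point expressed in the robot frame."""
    x, y, th = pose
    dx, dy = world_point[0] - x, world_point[1] - y
    return (dx * math.cos(th) + dy * math.sin(th),
            -dx * math.sin(th) + dy * math.cos(th))

# The target is 2 m straight ahead of the capture-time pose (0, 0, 0 rad)...
world_target = to_world((0.0, 0.0, 0.0), (2.0, 0.0))
# ...but the robot has since driven 1 m forward, so in the current frame
# the same world point is only 1 m ahead.
target_now = to_robot((1.0, 0.0, 0.0), world_target)
```

This is the role the synchronization information plays in Step 103: the capture-time pose anchors the clicked target in the world, and the current odometer pose re-expresses it for navigation.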



Abstract

The invention provides a robot realization method, a control method, a robot, and an electronic device. The robot realization method comprises the following steps: the robot sends an acquired real-scene image and its synchronization information; a control end determines a target position according to the real-scene image and sends the target position and the synchronization information to the robot; the robot determines its target position according to the received target position and synchronization information; a planned path is generated according to the obstacle information of the current scene and the target position of the robot; and the robot moves to the target position according to the planned path. With this technical scheme, a user only needs to specify a target point on the real-scene image uploaded by the robot, and the robot navigates autonomously to the selected target point. Motion control of the robot is thus more convenient, the control frequency is lower, the number of control instructions is reduced, and a friendly interactive experience can be provided even in a poor remote-control network environment.

Description

Technical Field

[0001] The present application relates to the technical field of robots, and in particular to a robot realization method, a control method, a robot, and electronic equipment.

Background Technique

[0002] Existing remote robot control technology is generally used for the motion control of products such as telepresence robots and service robots (such as security patrol robots and logistics robots).

[0003] The motion-control interaction mode of the telepresence robot is as follows: the user monitors the video stream sent back by the robot at the remote control terminal and sends direction and speed control instructions to the robot through a joystick (physical or virtual), keyboard, mouse, touch screen, etc. to adjust the robot's motion. The user must pay constant attention to the video stream transmitted by the robot while controlling its movement and decide on the next motion adjustment based on that video stream.

[0004] This method of remote ...

Claims


Application Information

IPC(8): G05D1/02
CPC: G05D1/0246
Inventor: 任光阔, 潘争, 赵勇
Owner: BEIJING DEEPGLINT INFORMATION TECH