
Robot control method, system and robot based on visual excitement point

A robot control method and robot technology, applied in the field of robots, to achieve a lifelike bionic effect and improved intelligence

Active Publication Date: 2021-11-05
SHANDONG JIAOTONG UNIV +1

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above-mentioned problem that existing robots cannot autonomously turn to key positions for action display or turn toward the user for human-robot interaction, a first aspect of the present invention provides a robot control method based on visual excitement points, which has the advantages of high robot intelligence and a realistic bionic effect.



Examples


Embodiment 1

[0032] Figure 1 shows the main principle of the robot control method based on visual excitement points of the present invention, which is:

[0033] S101: Obtain video of the robot's surrounding environment, and extract at least one feature point of interest from each frame of the video as a pre-selected excitement point;

[0034] S102: If there is one and only one pre-selected excitement point extracted, use it directly as the excitement point; if there are at least two pre-selected excitement points extracted, select the pre-selected excitement point with the highest priority as the excitement point according to the preset feature point priority order;

[0035] S103: According to the motion state of the excitement point, control the robot to perform matching actions.

[0036] Specifically, the feature points of interest include, but are not limited to, feature points of color regions of interest, feature points of recorded user faces, or feature points of moving objects.
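Paragraph [0036] lists three kinds of feature points of interest. The following is a minimal sketch, assuming OpenCV, of how such pre-selected excitement points might be extracted from one video frame (S101). The color thresholds, Haar cascade file, area thresholds, and the dictionary format of a point are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of S101-style extraction of pre-selected excitement points.
# Assumes OpenCV 4.x; all thresholds and the point format are illustrative.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_preselected_points(frame, prev_frame=None):
    points = []

    # Feature point of a color region of interest (here: a red-ish region in HSV).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 120, 80]), np.array([10, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                  # ignore tiny blobs
            x, y, w, h = cv2.boundingRect(c)
            points.append({"type": "color_region",
                           "position": (x + w // 2, y + h // 2),
                           "is_moving": False})

    # Feature point of a user's face.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        points.append({"type": "face",
                       "position": (x + w // 2, y + h // 2),
                       "is_moving": False})

    # Feature point of a moving object, via simple frame differencing.
    if prev_frame is not None:
        diff = cv2.absdiff(cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY), gray)
        _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(motion, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 800:
                x, y, w, h = cv2.boundingRect(c)
                points.append({"type": "moving_object",
                               "position": (x + w // 2, y + h // 2),
                               "is_moving": True})
    return points
```

Any other detectors (color trackers, face recognizers, optical flow) could be substituted; the patent only requires that each frame yields at least one feature point of interest as a pre-selected excitement point.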

[00...
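Building on the extraction sketch above, the following is a hedged sketch of how S102 (priority-based screening) and S103 (motion-state dependent action) could be wired together. The priority order (face > moving object > color region) and the returned command strings are assumptions for illustration; the patent only fixes that a preset priority order is used.

```python
# Hypothetical wiring of S102 and S103; priority order and commands are assumptions.

# Assumed preset feature-point priority order: lower value = higher priority.
PRIORITY = {"face": 0, "moving_object": 1, "color_region": 2}

def select_excitement_point(preselected, priority=PRIORITY):
    """S102: a single pre-selected point is used directly; among several,
    the one whose type has the highest priority is chosen."""
    if not preselected:
        return None
    if len(preselected) == 1:
        return preselected[0]
    return min(preselected, key=lambda p: priority.get(p["type"], len(priority)))

def control_robot(excitement_point):
    """S103: issue a command matching the motion state of the excitement
    point (placeholder commands, not real robot actions)."""
    if excitement_point is None:
        return "idle"
    if excitement_point["is_moving"]:
        return f"track moving target at {excitement_point['position']}"
    return f"turn toward {excitement_point['position']}"
```

A per-frame loop would then be, roughly, `control_robot(select_excitement_point(extract_preselected_points(frame, prev_frame)))`.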

Embodiment 2

[0055] As shown in Figure 3, the robot control system based on visual excitement points of this embodiment includes:

[0056] (1) A pre-selected excitement point extraction module, which is used to obtain video of the robot's surrounding environment and extract at least one feature point of interest from each frame as a pre-selected excitement point;

[0057] In a specific implementation, the feature points of interest include, but are not limited to, feature points of a color region of interest, feature points of a recorded user's face, or feature points of a moving object.

[0058] (2) An excitement point screening module, which is used to directly take the pre-selected excitement point as the excitement point if only one is extracted; if at least two pre-selected excitement points are extracted, it selects the pre-selected excitement point with the highest priority as the excitement point according to the preset feature point priority order;

[0059] (3) An action control module, which is used to control the robot to perform matching actions according to the motion state of the excitement point.
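A minimal structural sketch of these three modules follows. The class names are illustrative, not the patent's own API; the extraction and action primitives are passed in as callables (for example the hypothetical helpers from the sketches in Embodiment 1).

```python
# Hedged structural sketch of the three modules of Embodiment 2.

class PreselectedPointExtractionModule:
    def __init__(self, extractor):
        self.extractor = extractor            # e.g. extract_preselected_points

    def extract(self, frame, prev_frame=None):
        return self.extractor(frame, prev_frame)


class ExcitementPointScreeningModule:
    def __init__(self, priority):
        self.priority = priority              # preset feature-point priority order

    def screen(self, preselected):
        if not preselected:
            return None
        if len(preselected) == 1:             # only one pre-selected point: use it directly
            return preselected[0]
        return min(preselected,               # otherwise take the highest-priority one
                   key=lambda p: self.priority.get(p["type"], len(self.priority)))


class ActionControlModule:
    def __init__(self, robot_command):
        self.robot_command = robot_command    # e.g. control_robot

    def act(self, excitement_point):
        return self.robot_command(excitement_point)
```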

Embodiment 3

[0064] This embodiment provides a robot, which includes the robot control system based on visual excitement points as described in Embodiment 2.

[0065] It should be noted here that the other structures of the robot are existing structures and will not be repeated here.

[0066] In this embodiment, at least one feature point of interest is extracted from each frame of the robot's surrounding-environment video as a pre-selected excitement point; the excitement point is then screened according to the number of extracted pre-selected excitement points: when there is only one pre-selected excitement point, it is directly used as the excitement point; when there are at least two pre-selected excitement points, the pre-selected excitement point with the highest priority is selected as the excitement point according to the preset feature point priority order; finally, the robot is controlled to perform matching actions according to the motion state of the excitement point, real...



Abstract

The invention belongs to the field of robot control, and in particular relates to a robot control method, system and robot based on visual excitement points. It solves the problem that existing robots cannot autonomously turn to key positions for action display or turn toward the user for human-robot interaction, and has the advantages of high robot intelligence and a realistic bionic effect. Its technical solution is: a robot control method based on visual excitement points comprises acquiring video of the robot's surrounding environment and extracting at least one feature point of interest from each frame as a pre-selected excitement point; if there is only one pre-selected excitement point extracted, it is directly used as the excitement point; if there are at least two pre-selected excitement points, the pre-selected excitement point with the highest priority is selected as the excitement point according to the preset feature point priority order; and according to the motion state of the excitement point, the robot is controlled to perform matching actions.

Description

Technical field

[0001] The invention belongs to the field of robot control, and in particular relates to a robot control method, system and robot based on visual excitement points.

Background technique

[0002] The statements in this section merely provide background information related to the present invention and do not necessarily constitute prior art.

[0003] Existing bionic robots can simulate some behaviors of pets, such as sitting down, reaching out a paw, acting cute, etc. However, the inventors found that, due to the lack of processing of surrounding-environment images or the slow speed of image processing, existing robots are often unresponsive or slow to react to what they see in the surrounding environment, for example when recognizing a sign of a fixed color, turning toward the user or other people for feedback, or chasing a moving object.

Contents of the invention

[0004] In order to solve the above-mentioned problem that the existing robot...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): B25J9/16
CPCB25J9/1602B25J9/1664B25J9/1697
Inventor 范永武曌晗张辰谢爱珍陈彬
Owner SHANDONG JIAOTONG UNIV