Visual excitement point-based robot control method and system and robot

A robot control method and control system, applied in the field of robotics, which achieves realistic bionic effects and improves the robot's intelligence

Active Publication Date: 2020-09-04
SHANDONG JIAOTONG UNIV +1

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above-mentioned problem that an existing robot cannot autonomously turn to a key position to display actions or turn to the user to interact with people, the first aspect of the present invention provides a robot control method based on visual excitement points, which has the advantages of high robot intelligence and a realistic bionic effect.



Examples


Embodiment 1

[0032] Figure 1 shows the main principle of the robot control method based on visual excitement points of the present invention, which is as follows:

[0033] S101: Obtain a video of the surrounding environment of the robot, and extract at least one feature point of interest from each frame of the video as a pre-selected excitement point;
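As a rough illustration of this step only (not taken from the patent), the following sketch assumes a Python environment with the opencv-python package; the ExcitementPoint, Detector and frames_from_camera names are hypothetical and introduced purely for this example:

```python
# Sketch of step S101 (illustrative only; not taken from the patent).
# Assumes the opencv-python package; ExcitementPoint, Detector and
# frames_from_camera are hypothetical names introduced for this example.
from dataclasses import dataclass
from typing import Iterator, List, Protocol

import cv2


@dataclass
class ExcitementPoint:
    label: str   # kind of feature point, e.g. "face", "color", "motion"
    x: int       # pixel coordinates of the point in the frame
    y: int


class Detector(Protocol):
    def detect(self, frame) -> List[ExcitementPoint]:
        ...


def extract_preselected_points(frame, detectors: List[Detector]) -> List[ExcitementPoint]:
    """Run every detector on one frame and collect all pre-selected excitement points."""
    points: List[ExcitementPoint] = []
    for detector in detectors:
        points.extend(detector.detect(frame))
    return points


def frames_from_camera(device_index: int = 0) -> Iterator:
    """Yield frames of the robot's surrounding-environment video from a camera."""
    capture = cv2.VideoCapture(device_index)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            yield frame
    finally:
        capture.release()
```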

[0034] S102: If one and only one pre-selected excitement point is extracted, use it directly as the excitement point; if at least two pre-selected excitement points are extracted, select the pre-selected excitement point with the highest priority as the excitement point, according to the preset priority order of the feature points;
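A minimal sketch of this screening rule, reusing the ExcitementPoint type from the sketch above; the concrete priority order (faces before colour regions before moving objects) is an assumption made for illustration, since the text only requires some preset order:

```python
# Sketch of step S102 (illustrative only). The concrete priority order below
# (faces before colour regions before moving objects) is an assumption made
# for this example; the patent only requires some preset priority order.
PRIORITY = {"face": 0, "color": 1, "motion": 2}  # lower value = higher priority


def select_excitement_point(candidates):
    """Screen the pre-selected excitement points down to a single excitement point."""
    if not candidates:
        return None                      # nothing of interest in this frame
    if len(candidates) == 1:
        return candidates[0]             # only one pre-selected point: use it directly
    # at least two pre-selected points: take the one with the highest priority
    return min(candidates, key=lambda p: PRIORITY.get(p.label, len(PRIORITY)))
```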

[0035] S103: Control the robot to perform matching actions according to the motion state of the excitement point.
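For illustration only, the following sketch estimates the excitement point's motion state from its position in consecutive frames and maps it to a matching action; the robot object and its methods are hypothetical placeholders for the real actuator interface:

```python
# Sketch of step S103 (illustrative only). The motion state is estimated from
# the excitement point's position in consecutive frames; `robot` and its
# methods are hypothetical placeholders for the real actuator interface.
def track_motion(prev_point, curr_point, still_threshold: float = 5.0) -> str:
    """Classify the excitement point's motion from its horizontal displacement (pixels)."""
    if prev_point is None or curr_point is None:
        return "lost"
    dx = curr_point.x - prev_point.x
    if abs(dx) < still_threshold:
        return "still"
    return "moving_right" if dx > 0 else "moving_left"


def perform_matching_action(robot, motion_state: str) -> None:
    """Dispatch an action that matches the current motion state of the excitement point."""
    if motion_state == "still":
        robot.turn_towards_point()       # face the point and display an action
    elif motion_state in ("moving_left", "moving_right"):
        robot.follow(motion_state)       # track / chase the moving excitement point
    else:
        robot.idle()                     # no excitement point in view
```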

[0036] Specifically, the feature points of interest include, but are not limited to, feature points of color regions of interest, feature points of recorded user faces, or feature points of moving objects.
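Hypothetical detectors for these three kinds of feature points, built on standard OpenCV routines; the colour range, cascade file and thresholds are assumptions, and ExcitementPoint is the dataclass from the extraction sketch above:

```python
# Illustrative detectors for the three kinds of feature points named above,
# built on standard OpenCV routines. The colour range, cascade file and
# thresholds are assumptions; ExcitementPoint is the dataclass from the
# extraction sketch earlier in this embodiment.
import cv2
import numpy as np


class ColorRegionDetector:
    """Feature points of a colour region of interest (here: a red HSV band)."""

    def __init__(self, lower=(0, 120, 70), upper=(10, 255, 255), min_area=500):
        self.lower, self.upper, self.min_area = np.array(lower), np.array(upper), min_area

    def detect(self, frame):
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, self.lower, self.upper)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [ExcitementPoint("color", x + w // 2, y + h // 2)
                for x, y, w, h in (cv2.boundingRect(c) for c in contours
                                   if cv2.contourArea(c) >= self.min_area)]


class FaceDetector:
    """Feature points of a user's face (plain face detection; matching against a
    recorded user is omitted in this sketch)."""

    def __init__(self):
        self.cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect(self, frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = self.cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return [ExcitementPoint("face", x + w // 2, y + h // 2) for x, y, w, h in faces]


class MotionDetector:
    """Feature points of moving objects, found by simple frame differencing."""

    def __init__(self, min_area=800):
        self.prev_gray, self.min_area = None, min_area

    def detect(self, frame):
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        if self.prev_gray is None:
            self.prev_gray = gray
            return []
        diff = cv2.absdiff(self.prev_gray, gray)
        self.prev_gray = gray
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [ExcitementPoint("motion", x + w // 2, y + h // 2)
                for x, y, w, h in (cv2.boundingRect(c) for c in contours
                                   if cv2.contourArea(c) >= self.min_area)]
```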

[00...

Embodiment 2

[0055] As shown in Figure 3, the robot control system based on visual excitement points of this embodiment includes:

[0056] (1) A pre-selected excitement point extraction module, which is used to obtain the surrounding environment video of the robot and extract at least one feature point of interest from each frame of the video as a pre-selected excitement point;

[0057] In a specific implementation, the feature points of interest include, but are not limited to, feature points of a color region of interest, feature points of a recorded user's face, or feature points of a moving object.

[0058] (2) An excitement point screening module, which is used to take the pre-selected excitement point directly as the excitement point if only one is extracted, and, if at least two pre-selected excitement points are extracted, to select the pre-selected excitement point with the highest priority as the excitement point according to the preset priority order of the feature points;

[0059] (3) An action control module, which is used to control the robot to perform matching actions according to the motion state of the excitement point.
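One possible way to wire these three modules together, for illustration only; the class name is hypothetical, and the helper functions and ExcitementPoint type come from the sketches under Embodiment 1:

```python
# One possible wiring of the three modules (illustrative only; the helper
# functions and ExcitementPoint come from the sketches under Embodiment 1,
# and `robot` is a hypothetical actuator interface).
class VisualExcitementControlSystem:
    def __init__(self, detectors, robot):
        self.detectors = detectors   # used by (1) the extraction module
        self.robot = robot           # used by (3) the action control module
        self.prev_point = None       # excitement point of the previous frame

    def step(self, frame):
        """Process one frame of the surrounding-environment video."""
        candidates = extract_preselected_points(frame, self.detectors)  # module (1)
        point = select_excitement_point(candidates)                     # module (2)
        motion_state = track_motion(self.prev_point, point)             # module (3)
        perform_matching_action(self.robot, motion_state)
        self.prev_point = point
```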

Embodiment 3

[0064] This embodiment provides a robot, which includes the robot control system based on visual excitement points as described in Embodiment 2.

[0065] It should be noted that the other structures of the robot are existing structures and will not be repeated here.

[0066] In this embodiment, at least one feature point of interest is extracted from each frame of the robot's surrounding environment video as a pre-selected excitement point; the excitement point is then screened according to the number of extracted pre-selected excitement points: when there is only one pre-selected excitement point, it is used directly as the excitement point; when there are at least two pre-selected excitement points, the one with the highest priority is selected as the excitement point according to the preset priority order of the feature points. Finally, the robot is controlled to perform matching actions according to the motion state of the excitement point, achieving a realistic bionic effect and improving the intelligence of the robot.
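A hypothetical top-level loop for such a robot, tying the control system sketch above to the camera stream; all names come from the earlier sketches rather than from the patent, and robot stands for whatever actuator interface the real robot provides:

```python
# Hypothetical top-level loop for the robot of this embodiment, tying the
# control system to the camera stream. All names come from the earlier
# sketches; `robot` is whatever actuator interface the real robot provides.
def run(robot, device_index: int = 0):
    detectors = [FaceDetector(), ColorRegionDetector(), MotionDetector()]
    system = VisualExcitementControlSystem(detectors, robot)
    for frame in frames_from_camera(device_index):
        system.step(frame)
```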



Abstract

The invention belongs to the field of robot control, and particularly relates to a visual excitement point-based robot control method and system and a robot. The method solves the problem that an existing robot cannot autonomously turn to a key position to display actions or turn to a user to interact with the user, and has the advantages of high robot intelligence and a realistic bionic effect. According to the technical scheme, the visual excitement point-based robot control method comprises the following steps: a video of the surrounding environment of the robot is obtained, and at least one feature point of interest is extracted from each frame image to serve as a pre-selected excitement point; if only one pre-selected excitement point is extracted, it directly serves as the excitement point; if at least two pre-selected excitement points are extracted, the pre-selected excitement point with the highest priority is selected as the excitement point according to the preset priority order of the feature points; and the robot is controlled to perform matching actions according to the motion state of the excitement point.

Description

Technical Field

[0001] The invention belongs to the field of robot control, and in particular relates to a robot control method, system and robot based on visual excitement points.

Background Technique

[0002] The statements in this section merely provide background information related to the present invention and do not necessarily constitute prior art.

[0003] Existing bionic robots can simulate some behaviors of pets, such as sitting down, reaching out a paw and acting cute. However, the inventors found that, due to a lack of processing of the surrounding environment images or the slow speed of image processing, such robots are often unresponsive or slow to respond to what they see in the surrounding environment, for example recognizing signs of a fixed color, looking in the direction of the user or other people for feedback, or chasing moving objects.

Contents of the Invention

[0004] In order to solve the above-mentioned problem that the existing robot...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
CPC: B25J9/1602; B25J9/1664; B25J9/1697
Inventors: 范永武曌晗张辰谢爱珍陈彬
Owner: SHANDONG JIAOTONG UNIV