
Man-machine interaction method and man-machine interaction device

A technology relating to human-computer interaction and gesture actions, applied to input/output for user/computer interaction, computer components, mechanical pattern conversion, etc., with effects relating to privacy.

Pending Publication Date: 2021-05-04
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

In this case, it is difficult for the target device to determine which gesture action is valid or which user is the valid user, so effective human-computer interaction cannot be achieved.



Examples


Embodiment 1

[0112] This embodiment relates to a method for contactless (air-gesture) control of a vehicle through gesture actions.

[0113] First, referring to Figure 1, a general description of the interaction scenario of this embodiment is given.

[0114] As shown in Figure 1, in this embodiment the person in the human-computer interaction is exemplified by the user 300, the target device is exemplified by the vehicle 100, and the mobile terminal is exemplified by the smart phone 200. Specifically, the vehicle 100 is parked in parking space 601 of a parking lot, and the user 300 intends to control the vehicle 100 through air gestures. The smart phone 200 initiates a Bluetooth connection or a UWB connection with the vehicle 100, and after the vehicle 100 successfully authenticates the ID (identification) of the smart phone 200, the two establish the connection. Afterwards, the user 300 performs a predetermined operation on the smart phone 200. The predetermined operation indicat...
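Although the paragraph above is truncated, the flow it describes (establish a Bluetooth/UWB connection, authenticate the phone's ID, and only then accept air gestures) can be illustrated with a short sketch. The Python below is a minimal illustration under assumed names; `BluetoothLink`-style objects, `authenticate_id`, and the ID whitelist are hypothetical and not taken from the patent text.

```python
# Minimal sketch (assumed names): the vehicle only starts gesture
# recognition after the phone's ID has been authenticated over the link.

AUTHORIZED_PHONE_IDS = {"phone-200"}  # IDs paired with vehicle 100 (hypothetical)


def authenticate_id(phone_id: str) -> bool:
    """Vehicle-side ID check; in practice this would be a cryptographic challenge."""
    return phone_id in AUTHORIZED_PHONE_IDS


def on_connection_request(phone_id: str, link) -> bool:
    """Called when smart phone 200 initiates a Bluetooth/UWB connection."""
    if not authenticate_id(phone_id):
        link.reject()
        return False
    link.accept()
    enable_gesture_detection(link)  # only now watch for air gestures
    return True


def enable_gesture_detection(link) -> None:
    # Placeholder: start the optical-sensor pipeline and wait for the
    # "predetermined operation" notification from the phone over `link`.
    pass
```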

Embodiment 2

[0230] Embodiment 2 of the present application will be described below.

[0231] This embodiment relates to a method for summoning a vehicle through a user's gesture action.

[0232] Specifically, in this embodiment, referring to Figure 9, the user 301 operates ride-hailing software on the smart phone 201 to reserve an unmanned taxi (Robotaxi) through the cloud server 400. The smart phone 201 sends its location information to the cloud server 400 through the communication network; after the cloud server 400 performs dispatch processing, the vehicle 101 is selected as the unmanned taxi, and the location information of the smart phone 201 is sent to the vehicle 101 through the communication network. The vehicle 101 then travels toward the user 301 according to the location information. When it arrives near the user 301 (for example, within 100 meters or within tens of meters), the vehicle 101 needs to know the precise location of the user 301 in order to provide more detailed se...
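As a rough illustration of the arrival logic described above, the sketch below shows how the vehicle might switch from coarse GPS-based navigation to a fine localization step once it is within some threshold of the phone's reported position. All names (`haversine_m`, `request_fine_localization`, the 100 m threshold) are illustrative assumptions, not the patent's actual implementation.

```python
import math

FINE_LOCALIZATION_RADIUS_M = 100.0  # "near the user", e.g. 100 m (per the example above)


def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def on_position_update(vehicle_fix, phone_fix):
    """Drive toward the phone's reported location; switch to fine localization when close."""
    d = haversine_m(*vehicle_fix, *phone_fix)
    if d <= FINE_LOCALIZATION_RADIUS_M:
        # Hypothetical step: ask the user (via the app) to make a gesture, then
        # match the camera-observed gesture against the phone's motion-sensor trace.
        request_fine_localization()
    else:
        continue_navigation_to(phone_fix)


def request_fine_localization():
    pass  # placeholder


def continue_navigation_to(fix):
    pass  # placeholder
```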

Embodiment 3

[0304] This embodiment relates to a method for a user to interact with a meal delivery robot.

[0305] Recently, more and more restaurants use food delivery robots to deliver meals. In such cases it is generally necessary to pre-set the position of each dining table so that the robot can deliver food accurately, which prevents customers from freely choosing a seat or changing seats after being seated.

[0306] In addition, multiple customers at the same table sometimes order separately, or the tables are long shared tables (common in fast food restaurants). In these cases the robot cannot accurately distinguish which customer is the correct delivery target, and thus cannot provide more refined service (such as facing the customer from the best angle). If customers are asked to wave, then during rush hour, for example, several people may be waving at once, confusing the robot.

[0307] To this end, this embodiment provides a method for ...
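Although the paragraph above is cut off, the disambiguation problem it raises (several people waving at once) maps naturally onto the matching idea from the abstract: compare each camera-observed waving trajectory with the motion trace reported by the ordering customer's phone, and pick the best match. The sketch below is an assumed illustration; `trajectory_similarity`, the data shapes, and the threshold are hypothetical.

```python
import numpy as np


def trajectory_similarity(cam_traj: np.ndarray, imu_traj: np.ndarray) -> float:
    """Very rough similarity: correlation of per-step motion magnitudes,
    after truncating both traces to a common length. Hypothetical metric."""
    n = min(len(cam_traj), len(imu_traj))
    a = np.linalg.norm(np.diff(cam_traj[:n], axis=0), axis=1)
    b = np.linalg.norm(np.diff(imu_traj[:n], axis=0), axis=1)
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])


def pick_customer(candidates: dict[str, np.ndarray], phone_traj: np.ndarray,
                  threshold: float = 0.7) -> str | None:
    """Return the candidate whose observed hand trajectory best matches the
    trajectory reported by the customer's phone, or None if nothing matches."""
    best_id, best_score = None, threshold
    for cand_id, cam_traj in candidates.items():
        score = trajectory_similarity(cam_traj, phone_traj)
        if score > best_score:
            best_id, best_score = cand_id, score
    return best_id
```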



Abstract

The invention provides a man-machine interaction method and related apparatus. When the user interacts with the target device, the user makes a gesture action with the hand that is holding the mobile terminal. On one hand, the gesture action information of the user is detected by an optical sensor of the target device; on the other hand, terminal motion trajectory information, i.e. the trajectory of the mobile terminal moving together with the user's hand, is detected by a motion sensor of the mobile terminal. It is then judged whether the gesture action information matches the terminal motion trajectory information; if so, the corresponding first control is executed. Because the mobile terminal moves together with the user's hand, the motion trajectory information of the mobile terminal uniquely corresponds to the gesture action information of that hand, so whether the gesture action is valid can be reliably judged by checking whether the gesture action information matches the terminal trajectory information. Interference from gesture actions of unrelated persons is thus avoided, and effective man-machine interaction can be realized.
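The core judgment described in the abstract (does the camera-observed gesture trajectory match the trajectory measured by the phone's motion sensor?) can be sketched as follows. This is a minimal, assumed illustration using a correlation of motion magnitudes; the patent does not specify a particular matching algorithm, and every name and threshold here is hypothetical.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed decision threshold


def motion_profile(samples: np.ndarray) -> np.ndarray:
    """Reduce a trajectory (N x D positions) to a normalized 1-D
    magnitude-of-change profile, so camera and IMU data become comparable."""
    prof = np.linalg.norm(np.diff(samples, axis=0), axis=1)
    return (prof - prof.mean()) / (prof.std() + 1e-9)


def gestures_match(camera_traj: np.ndarray, terminal_traj: np.ndarray) -> bool:
    """Judge whether the optically detected gesture matches the terminal's
    motion trace; if so, the target device executes the first control."""
    n = min(len(camera_traj), len(terminal_traj))
    a = motion_profile(camera_traj[:n])
    b = motion_profile(terminal_traj[:n])
    score = float(np.dot(a, b) / len(a))  # correlation at lag 0
    return score >= MATCH_THRESHOLD


# Usage example: the phone moves with the hand, so the two traces correlate.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hand = np.cumsum(rng.normal(size=(50, 2)), axis=0)          # camera-observed hand path
    phone = hand + rng.normal(scale=0.05, size=hand.shape)      # phone trace, small sensor noise
    print("execute first control" if gestures_match(hand, phone) else "ignore gesture")
```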

Description

Technical Field

[0001] The present application relates to a human-computer interaction method and a human-computer interaction device.

Background Technique

[0002] In the prior art, there are technologies that allow a user to interact with a target device through air gestures and other contactless operations. For example, the user interacts with a vehicle, as the target device, through gestures made outside the car, so as to control the vehicle to start in advance or to command the vehicle to reverse into a parking space, etc.

[0003] Here, in order to prevent unauthorized control, it is necessary to authenticate the validity of the gesture action or the validity of the user's identity. Specifically, for example, when a valid user makes a gesture, another person (an invalid user) may be standing nearby and also make a gesture at roughly the same time. In that case it is difficult for the target device to determine which gesture action is valid or which user is the valid user, so tha...

Claims


Application Information

IPC(8): G06F3/01; G06K9/00; G06K9/62
CPC: G06F3/017; G06V40/28; G06F18/22; G06F3/0304; G06F3/0346; B60W60/00253; B60K35/00; B60K35/28; B60K35/29; B60K2360/176; B60K2360/191
Inventor 彭帅华, 武昊
Owner HUAWEI TECH CO LTD