
Industrial Human-Computer Interaction System and Method Based on Visual and Force-Tactile Augmented Reality

An augmented reality and force-tactile feedback technology, applied to user/computer input-output interaction, mechanical mode conversion, computer components, etc. It addresses the problems that existing intelligent interaction methods cannot give the user timely feedback on whether an operation has been recognized by the system, and that human-computer interaction remains bound to a fixed space rather than moving with the user. The effects achieved include improved convenience, reduced cost, and improved compatibility.

Active Publication Date: 2022-02-11
QINGDAO TECHNOLOGICAL UNIVERSITY

AI Technical Summary

Problems solved by technology

In that solution, the user and the device must be located in the same physical space so that the user's eyes can focus on the controlled device. The human-computer interaction process is therefore still spatially constrained: the interaction cannot break free of the space restriction and move with the user, and the recognition image processing is complicated.
In addition, the intelligent interaction method of that solution cannot give the user timely feedback on the effectiveness of the interaction; that is, the user cannot know in time whether an operation has been recognized by the system.

Method used



Examples


Embodiment 1

[0050] Referring to figure 1, an industrial human-computer interaction system based on visual and force-tactile augmented reality includes a controlled device, an augmented reality smart device, and a fingertip force/tactile feedback device worn on or held by the operator's fingertip.

[0051] The augmented reality smart device runs the controlled-device App and uses augmented reality technology to superimpose the App's software interface on the physical environment for display. Sensors on the augmented reality smart device detect and calculate the position of the fingertip force/tactile feedback device, and the system then judges whether that device collides with the software interface of the controlled-device App. If there is no collision, a non-collision command is sent to the controlled-device App; if there is a collision, an interface collision command and the collision point are sent. The controlled-device App analyzes the collision comm...
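The collision judgment described above, checking the computed fingertip-device position against the App interface superimposed in space, can be sketched as a point-versus-panel test. Everything below (the panel parameterization, function names, and the plane tolerance) is an illustrative assumption; the patent does not specify how the interface geometry is represented.

```python
def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def fingertip_panel_collision(fingertip, origin, ex, ey, width, height,
                              thickness=0.005):
    """Test whether the fingertip device position collides with the App
    interface panel superimposed in space.

    The panel is modeled as a rectangle: `origin` is one corner, `ex` and
    `ey` are unit vectors along its width and height, and `thickness` is
    the tolerance around the panel plane (this parameterization is an
    assumption for illustration). Returns (hit, collision_point), where
    collision_point is the (u, v) panel coordinate of the hit, or None.
    """
    d = _sub(fingertip, origin)
    u = _dot(d, ex)                  # in-plane coordinate along the width
    v = _dot(d, ey)                  # in-plane coordinate along the height
    dist = _dot(d, _cross(ex, ey))   # signed distance from the panel plane
    hit = abs(dist) <= thickness and 0.0 <= u <= width and 0.0 <= v <= height
    return hit, ((u, v) if hit else None)
```

If the test reports a hit, the interface collision command would be sent together with the (u, v) collision point; otherwise the non-collision command is sent.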

Embodiment 2

[0064] This embodiment addresses the case in which the augmented reality smart device does not already provide the controlled-device App. It proposes using App management software to manage and control the controlled-device App and the fingertip force/tactile feedback device, realizing collision detection and human-computer interaction.

[0065] Referring to figure 2 and image 3, in this embodiment the industrial human-computer interaction system includes a controlled device, an augmented reality smart device, a fingertip force/tactile feedback device, and a cloud server for downloading the controlled-device App. The augmented reality smart device also includes App management software: the App management software is run first on the augmented reality smart device, and the controlled-device App is then found and run through it. The App management software includes an App data storage table, an augmented reality registration module, an interaction...
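A minimal sketch of the App management software's role as described above: keeping an App data storage table, fetching a missing controlled-device App from the cloud server, and launching it for collision-based interaction. All class, field, and method names here are assumptions for illustration, not the patent's actual software structure.

```python
from dataclasses import dataclass

@dataclass
class AppRecord:
    """One row of the App data storage table (field names are assumptions)."""
    device_id: str
    app_name: str
    download_url: str
    installed: bool = False

class AppManager:
    """Illustrative sketch of the App management software: it keeps the
    App data storage table, fetches a missing controlled-device App from
    the cloud server, and launches it."""

    def __init__(self):
        self.table = {}  # device_id -> AppRecord

    def register(self, record):
        self.table[record.device_id] = record

    def find_and_run(self, device_id):
        record = self.table.get(device_id)
        if record is None:
            raise KeyError(f"no App registered for device {device_id}")
        if not record.installed:
            # Placeholder for downloading the App from the cloud server
            # via record.download_url.
            record.installed = True
        return f"running {record.app_name}"
```

In this sketch, the download step is a stub; in the embodiment it would contact the cloud server before the App is launched.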

Embodiment 3

[0087] Referring to figure 1, an industrial human-computer interaction method based on visual and force-tactile augmented reality includes the following steps:

[0088] Step 10: the augmented reality smart device runs the controlled-device App, establishes a connection between the controlled-device App and the controlled device, and superimposes the App's software interface on the physical environment for display.

[0089] Steps 20 and 30 are then executed to carry out the interaction among the person, the controlled-device App, and the controlled device:

[0090] Step 20: use the sensors on the augmented reality smart device to detect and calculate the position of the fingertip force/tactile feedback device, and then judge whether it collides with the App software interface of the controlled device. If there is no collision, a non-collision command is sent to the controlled-device App and the fingertip force/tactile feedback device. If there is a collision, the in...
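The steps above can be sketched as one interaction pass: the collision result is translated into a command for the controlled-device App, which maps the collision point to the widget under it and generates a mouse command. The command names and the widget-layout format are illustrative assumptions; the patent only states that a non-collision command, or an interface collision command plus the collision point, is sent.

```python
def interaction_step(collided, collision_point):
    """Step 20 (sketch): translate the collision test result into the
    command sent to the controlled-device App."""
    if not collided:
        return {"cmd": "NO_COLLISION"}
    return {"cmd": "INTERFACE_COLLISION", "point": collision_point}

def app_generate_mouse_command(command, widgets):
    """Sketch of the controlled-device App side: map the collision point
    onto the interface widget under it and generate the corresponding
    mouse command. `widgets` maps a widget name to its (u0, v0, u1, v1)
    rectangle in panel coordinates -- an assumed layout format."""
    if command["cmd"] != "INTERFACE_COLLISION":
        return None
    u, v = command["point"]
    for name, (u0, v0, u1, v1) in widgets.items():
        if u0 <= u <= u1 and v0 <= v <= v1:
            return {"mouse": "click", "widget": name}
    return None  # collision fell outside every widget
```

In the full method, the resulting mouse command would in turn be mapped to a device operation command and sent to the controlled device.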



Abstract

The present invention relates to an industrial human-computer interaction system based on visual and force-tactile augmented reality, including a controlled device, an augmented reality smart device, and a fingertip force/tactile feedback device worn on or held by the operator's fingertip. The augmented reality smart device runs the controlled-device App and superimposes the App's software interface on the physical environment for display. Sensors on the augmented reality smart device detect and calculate the position of the fingertip force/tactile feedback device, and the system then determines whether that device collides with the software interface of the controlled-device App. If there is a collision, an interface collision command and the collision point are sent to the controlled-device App, which analyzes them, generates a mouse command, executes the mouse command to produce the corresponding device operation command, and sends that command to the controlled device, thereby realizing human-computer interaction between the operator and the App software interface of the controlled device.

Description

Technical field
[0001] The invention relates to an industrial human-computer interaction system and method based on visual and force-tactile augmented reality, belonging to the fields of human-computer interaction and industrial measurement and control.
Background technique
[0002] At present, there are many human-machine interfaces in the industrial field, including industrial touch screens, instruments, etc. These interfaces are usually connected directly to the equipment, and each piece of equipment needs its own human-computer interaction interface; system integration time is long, cost is high, and flexibility is low.
[0003] The invention patent with publication number CN 107506037 A, "A Method and Device for Controlling Devices Based on Augmented Reality", determines that a device is the device to be identified when it detects that the user's eyes focus on it, after which the user uses the intelligent interaction method to ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01; G06F3/0354; G06F3/038; G06F3/04812; G06F3/0484; G06F3/0488; G06F9/451
CPC: G06F3/011; G06F3/016; G06F3/03543; G06F3/0383; G06F3/04812; G06F3/0484; G06F3/0488; G06F9/451
Inventor: 陈成军李东年于浩洪军井陆阳
Owner: QINGDAO TECHNOLOGICAL UNIVERSITY