
Real-time man-machine interaction system in virtual scene

An interactive system based on behavior-prediction technology, applied in the field of human-computer interaction. It addresses errors in existing visual attention models and their neglect of the influence of scene content on user behavior, achieving high accuracy and fast speed.

Active Publication Date: 2020-06-23
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

[0002] At present, many user behavior prediction algorithms have been proposed in the field of human-computer interaction, but most of them simply repeat the user's previous operations, ignoring the influence of the scene content observed by the user on user behavior. Moreover, research on the visual attention models used to detect the user's attention to scene content has mostly targeted smoothly changing scenes, so these models produce errors when the scene content changes abruptly.

Method used


Image

  • Real-time man-machine interaction system in virtual scene

Examples


Embodiment Construction

[0018] As shown in Figure 1, the input to the system is a sequence of RGB video frames. The visual attention area prediction module computes the position of the user's viewpoint, which is superimposed on the current video frame and fed to the behavior prediction module to predict user behavior.
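The pipeline in [0018] can be sketched as a minimal loop. The real system uses learned models; `predict_viewpoint` and `predict_behavior` below are hypothetical stand-ins, and the brightest-pixel heuristic and extra attention channel are illustration assumptions, not the patent's method:

```python
import numpy as np

def predict_viewpoint(frame):
    """Hypothetical stand-in for the visual attention area prediction
    module: return the brightest pixel location as the viewpoint (row, col)."""
    gray = frame.mean(axis=2)                  # collapse RGB to intensity
    return np.unravel_index(np.argmax(gray), gray.shape)

def overlay_viewpoint(frame, viewpoint, radius=2):
    """Superimpose the predicted viewpoint on the current frame as an
    extra binary attention channel, so frame and viewpoint reach the
    behavior predictor jointly (channel encoding is an assumption)."""
    h, w, _ = frame.shape
    mask = np.zeros((h, w, 1), dtype=frame.dtype)
    r, c = viewpoint
    mask[max(0, r - radius):r + radius + 1, max(0, c - radius):c + radius + 1] = 1
    return np.concatenate([frame, mask], axis=2)

def predict_behavior(augmented_frame):
    """Placeholder behavior predictor: classify by where attention sits."""
    attention = augmented_frame[..., -1]
    rows = np.where(attention.any(axis=1))[0]
    return "look_up" if rows.mean() < attention.shape[0] / 2 else "look_down"

# One step of the real-time loop over the RGB video frame sequence.
frame = np.zeros((8, 8, 3), dtype=np.float32)
frame[1, 6] = 1.0                              # a bright spot near the top
vp = predict_viewpoint(frame)
behavior = predict_behavior(overlay_viewpoint(frame, vp))
```

In the patented system each placeholder would be replaced by the trained module, but the data flow (frame → viewpoint → augmented frame → behavior) matches the description.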

[0019] As shown in Figure 2, the visual attention area prediction module comprises a target detection unit, a smooth motion detection unit, a mutation detection unit, a saliency map generation unit, and a feature extraction unit. The smooth motion detection unit uses FlowNet as its computing network and accepts bounding-box information from the target detection unit to refine the motion optical-flow calculation; the target detection unit uses YOLO v3 as its computing network. The mutation detection unit includes a motion mutation detector and a color mutation detector; the motion mutation detector accepts smooth motion detecti...
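A minimal sketch of the two mutation (abrupt change) detectors named in [0019]. The thresholds and the frame-difference and flow-magnitude statistics are illustration assumptions, not the patent's actual criteria; in the described system the flow would come from FlowNet restricted to YOLO v3 bounding boxes:

```python
import numpy as np

def color_mutation(prev, curr, thresh=0.3):
    """Color mutation detector (sketch): flag an abrupt scene change when
    the mean per-pixel color difference between consecutive frames exceeds
    a threshold (threshold value is an assumed illustration parameter)."""
    return float(np.abs(curr - prev).mean()) > thresh

def motion_mutation(flow, thresh=5.0):
    """Motion mutation detector (sketch): flag a mutation when the mean
    optical-flow magnitude jumps above a threshold. `flow` is any
    (H, W, 2) displacement field, e.g. from a FlowNet-style network."""
    magnitude = np.linalg.norm(flow, axis=2)
    return float(magnitude.mean()) > thresh

# Simulated hard cut from a black frame to a white frame.
prev = np.zeros((4, 4, 3))
cut = np.ones((4, 4, 3))
```

Either detector firing would mark the current frame as a mutation, letting the saliency map generation unit handle it differently from smoothly changing content.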



Abstract

The invention discloses a real-time man-machine interaction system in a virtual scene. The system comprises a visual attention area prediction module used for mutation detection and a behavior prediction module based on visual attention area characteristics. The visual attention area prediction module receives an input video frame sequence, carries out target information detection, smooth motion detection, and mutation information detection in sequence to obtain a visual saliency map, and extracts the visual attention area from the saliency map to obtain an attention area map. The behavior prediction module predicts user behaviors using features extracted from the user's visual area and the video content. By taking as input the video content observed by the user, the invention predicts the user's feedback behavior after observing the video, so the system can cope with sudden changes in the scene as well as smoothly changing scenes.
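The abstract's final prediction-module step, turning the visual saliency map into an attention area map, can be illustrated with a simple top-fraction threshold. The extraction rule and the `keep` parameter are assumptions for illustration; the abstract does not specify how the area is extracted:

```python
import numpy as np

def attention_area(saliency, keep=0.1):
    """Extract a binary attention area map from a visual saliency map by
    keeping the most salient fraction of pixels (simple thresholding
    sketch; `keep` is an assumed illustration parameter)."""
    thresh = np.quantile(saliency, 1.0 - keep)
    return (saliency >= thresh).astype(np.uint8)

# A toy saliency map with values increasing left-to-right, top-to-bottom.
saliency = np.arange(100, dtype=np.float64).reshape(10, 10) / 99.0
area = attention_area(saliency, keep=0.1)
```

The resulting binary map marks the region whose features, together with the video content, feed the behavior prediction module.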

Description

technical field

[0001] The present invention relates to a technology in the field of human-computer interaction, and in particular to a real-time human-computer interaction system in a virtual scene.

Background technique

[0002] At present, many user behavior prediction algorithms have been proposed in the field of human-computer interaction, but most of them simply repeat the user's previous operations, ignoring the influence of the scene content observed by the user on user behavior. Moreover, research on the visual attention models used to detect the user's attention to scene content has mostly targeted smoothly changing scenes, so these models produce errors when the scene content changes abruptly.

Contents of the invention

[0003] Aiming at the above-mentioned deficiencies in the prior art, the present invention proposes a behavior prediction interaction system based on the visual content observed by the user. By inputting the vid...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00
CPC: G06V20/42; G06V20/41; G06V20/46; G06V2201/07; Y02D10/00
Inventor: 卞琛毓, 肖双九
Owner: SHANGHAI JIAO TONG UNIV