Method for recognizing visual motions in virtual assembly sitting operation

A technology of motion recognition and virtual assembly, applied in the field of somatosensory interaction

Inactive Publication Date: 2018-08-17
JILIN UNIV


Problems solved by technology

At present, there is no technology that can automatically recognize visual motions; the existing material offers only guidelines that a human analyst must follow when recognizing such motions manually.




Embodiment Construction

[0054] A method for recognizing visual motions in a virtual assembly sitting operation, the method comprising the following steps:

[0055] Step 1. Collect bone and eye coordinates, and construct an eye center coordinate system.

[0056] Referring to Figure 1, the user stands facing the Kinect V2 on the left with the left arm straightened. Through the depth camera in the Kinect V2 human-computer interaction device, the left-eye coordinate point A1(x1, y1, z1) and the right-eye coordinate point A2(x2, y2, z2) are collected, and the eye-center coordinates are calculated as the midpoint A3 = ((x1+x2)/2, (y1+y2)/2, (z1+z2)/2). Sixteen skeletal points of the human body are collected: head, neck A4(x4, y4, z4), shoulder center, left thumb, right thumb, left fingertip, right fingertip, left hand, right hand, left wrist, right wrist, left elbow A5(x5, y5, z5), right elbow, left shoulder A6(x6, y6, z6), right shoulder, and hip center A7(x7, y7, z7). First make the relative off...
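Step 1 reduces to a midpoint computation followed by a change of origin for the skeletal points. A minimal sketch in Python (the `Point3D` type and function names are illustrative assumptions, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    z: float

def eye_center(left_eye: Point3D, right_eye: Point3D) -> Point3D:
    """Midpoint of the two eye points, used as the origin A3 of the eye-center frame."""
    return Point3D((left_eye.x + right_eye.x) / 2,
                   (left_eye.y + right_eye.y) / 2,
                   (left_eye.z + right_eye.z) / 2)

def to_eye_frame(p: Point3D, center: Point3D) -> Point3D:
    """Express a skeletal point relative to the eye-center origin."""
    return Point3D(p.x - center.x, p.y - center.y, p.z - center.z)

# Example with illustrative values for A1 (left eye) and A2 (right eye):
a1 = Point3D(0.10, 1.60, 2.00)
a2 = Point3D(0.16, 1.60, 2.02)
a3 = eye_center(a1, a2)
head_relative = to_eye_frame(Point3D(0.13, 1.70, 2.01), a3)
```

All 16 skeletal points would be re-expressed in this eye-center frame before the view angles of Step 4 are computed.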



Abstract

The invention belongs to the field of motion-sensing interaction, and in particular relates to a method for automatically recognizing visual motions in virtual assembly. The method comprises: a step 1 of acquiring bone and eye coordinates and constructing an eye-center coordinate system; a step 2 of setting an eye activity range and marking a visual object; a step 3 of confirming that the hands have no operation; a step 4 of determining the field-of-view angle of each target object to mark an object on the vertical and horizontal planes, and then correcting the field-of-view angle according to the rotation state of the head; and a step 5 of recognizing a visual motion according to the values of a vertical-plane view angle α and a horizontal-plane view angle β. The method uses a Kinect V2 device to collect the body coordinates, establish the eye-center coordinate system, and set the eye activity range. After it is confirmed that there is no hand operation, the field-of-view angles of the marked object on the vertical and horizontal planes can be obtained according to the neck movement position and the head rotation, and finally the visual motion can be determined based on a motion range.
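Step 4 of the abstract computes a vertical-plane view angle α and a horizontal-plane view angle β for each marked object. One plausible decomposition (an assumption on our part; the patent's exact formulas are not shown in this extraction) treats α and β as the elevation and azimuth of the sight line from the eye center:

```python
import math

def view_angles(eye, obj):
    """Decompose the sight line from the eye center to a target object into a
    vertical-plane angle alpha (elevation) and a horizontal-plane angle beta
    (azimuth), assuming the camera's z axis points forward and y points up.
    Both angles are returned in degrees."""
    dx = obj[0] - eye[0]
    dy = obj[1] - eye[1]
    dz = obj[2] - eye[2]
    alpha = math.degrees(math.atan2(dy, dz))  # angle in the vertical plane
    beta = math.degrees(math.atan2(dx, dz))   # angle in the horizontal plane
    return alpha, beta

# An object level with the eyes and straight ahead yields alpha = beta = 0:
alpha, beta = view_angles((0.0, 1.6, 0.0), (0.0, 1.6, 2.0))
```

Step 5 would then compare α and β against the configured motion range to decide whether the gaze shift counts as a visual motion.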

Description

Technical Field

[0001] The invention belongs to the field of somatosensory interaction, and specifically relates to a method for recognizing visual motions in a virtual assembly sitting operation, for the automatic recognition of visual motions in virtual assembly.

Background Art

[0002] The MOD-value calculation method measures time according to the performance motions of different parts of the body, and visual motions are part of it. It stipulates that the eyes move in order to see things clearly, and each such action is represented by the code E2. [0003] The eyes are important sensory organs and play a guiding role in people's actions. When the hand is moving, it is generally necessary to glance at the position of the object to control the speed and direction of the hand. This kind of eye movement usually occurs before or during the hand movement, for example when reading a document, looking for a mark on a picture, or looking at the position ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/20
CPC: G06V40/28; G06V10/22
Inventors: 姜盛乾, 于新, 李雨洋, 张昕莹, 王炳辉, 陈雪纯, 黄卓, 徐杨, 张开淦
Owner JILIN UNIV