Human body behavior recognition method in man-machine cooperation assembly scene

A human behavior recognition technology for human-machine collaboration, applied in the field of human behavior recognition, which addresses the problems of low behavior recognition accuracy, frequent recognition errors, and low assembly efficiency.

Status: Pending | Publication Date: 2022-04-26
TAIZHOU UNIV
Cites: 0 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] However, existing methods suffer from low behavior recognition accuracy and slow recognition speed, and they are prone to recognition errors, resulting in low assembly efficiency.


Examples

Embodiment 1

[0035] As shown in Figure 1, a human behavior recognition method for a human-machine collaborative assembly scene comprises the following steps:

[0036] Step 1: Set up two somatosensory devices with an included angle θ between them, and obtain from the devices the joint point coordinate streams of the skeletal joints under human behavior;
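
A rough data-structure sketch for Step 1, assuming Kinect-style devices with 25 tracked joints (the patent names neither the hardware nor the joint count); SensorStream and push are illustrative names, not the patent's:

    import numpy as np

    NUM_JOINTS = 25  # Kinect-v2-style skeleton; an assumption, the patent fixes no count

    class SensorStream:
        """Joint point coordinate stream from one somatosensory device (illustrative)."""
        def __init__(self, device_id):
            self.device_id = device_id
            self.frames = []  # each entry: (NUM_JOINTS, 3) array of x, y, z coordinates

        def push(self, joints):
            joints = np.asarray(joints, dtype=float)
            assert joints.shape == (NUM_JOINTS, 3)
            self.frames.append(joints)

    # Two devices whose viewing directions form the included angle theta.
    theta = np.deg2rad(60.0)  # example value; the patent leaves theta as a parameter
    stream_a, stream_b = SensorStream(0), SensorStream(1)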

[0037] Step 2: Screen for joint point coordinate streams with complete skeletal joints, and locate the start and end positions of the action with a motion event segmentation algorithm to obtain the joint point information;
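
The patent does not disclose the internals of its motion event segmentation algorithm; below is a minimal sketch under two assumptions: untracked joints appear as NaN placeholders, and an action spans the frames whose inter-frame motion energy exceeds a threshold:

    import numpy as np

    def is_complete(frame):
        """Keep a frame only if every skeletal joint was tracked (no NaN gaps)."""
        return not np.isnan(frame).any()

    def segment_action(frames, energy_thresh=1e-3):
        """Locate (start, end) frame indices of an action by a motion-energy rule.

        frames: (T, NUM_JOINTS, 3). Energy per frame is the summed displacement
        of all joints relative to the previous frame; the action spans the
        frames where this energy stays above the threshold.
        """
        energy = np.linalg.norm(np.diff(frames, axis=0), axis=2).sum(axis=1)
        active = np.flatnonzero(energy > energy_thresh)
        if active.size == 0:
            return None                       # no motion event in this window
        return int(active[0]), int(active[-1]) + 1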

[0038] Step 3: Resample the joint point information according to the included angle θ to obtain the joint point coordinates;
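
One plausible reading of "resampled according to the included angle θ" is mapping the second device's joints into the first device's coordinate frame by a rotation about the vertical axis; the sketch below implements only that assumed reading (the patent's actual step may also translate or interpolate between the two views):

    import numpy as np

    def to_reference_frame(frame, theta):
        """Rotate joints captured by the second device about the vertical (y)
        axis by theta, mapping them into the first device's coordinate frame.
        An assumed interpretation of the resampling step, not the patent's exact rule."""
        c, s = np.cos(theta), np.sin(theta)
        rot_y = np.array([[  c, 0.0,   s],
                          [0.0, 1.0, 0.0],
                          [ -s, 0.0,   c]])
        return frame @ rot_y.T   # frame: (NUM_JOINTS, 3)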

[0039] Step 4: Using the coordinates of the spine base (joint 0) as the origin of a local coordinate system, normalize the coordinates of the other joint points, then smooth the result to obtain the skeleton sequence that constitutes an action;
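
A minimal sketch of Step 4; the spine-base origin is from the patent, while the moving-average filter (and its window) is an assumed choice, since the patent only says the sequence is smoothed:

    import numpy as np

    SPINE_BASE = 0   # joint 0, as stated in the patent

    def normalize_and_smooth(frames, window=5):
        """Express every joint relative to the spine base, then smooth each
        coordinate over time with a moving average (filter choice assumed).

        frames: (T, NUM_JOINTS, 3) -> same shape, the skeleton sequence of one action.
        """
        local = frames - frames[:, SPINE_BASE:SPINE_BASE + 1, :]
        kernel = np.ones(window) / window
        flat = local.reshape(local.shape[0], -1)          # (T, NUM_JOINTS*3)
        smoothed = np.apply_along_axis(
            lambda series: np.convolve(series, kernel, mode="same"), 0, flat)
        return smoothed.reshape(local.shape)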

[0040] Step 5: Simplify the vectors of adjacent upper-limb joint points to obtain the vector direction of each upper limb, calculate the included angle between each of the left and right upper-limb vector directions and the vertical direction, and use these angles to classify the scene as a left-hand or right-hand scene.
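
A sketch of the scene split, assuming Kinect-v2-style joint indices (not given in the patent) and the simple rule that the arm deviating more from vertical is the acting one:

    import numpy as np

    # Kinect-v2-style joint indices; the specific numbers are assumptions.
    SHOULDER_L, HAND_L = 4, 7
    SHOULDER_R, HAND_R = 8, 11

    def limb_angle_to_vertical(frame, shoulder, hand):
        """Collapse the adjacent upper-limb joints into one shoulder-to-hand
        vector and return its included angle (radians) with the vertical."""
        v = frame[hand] - frame[shoulder]
        vertical = np.array([0.0, -1.0, 0.0])      # gravity direction, camera frame
        cos_a = v @ vertical / (np.linalg.norm(v) + 1e-9)   # |vertical| = 1
        return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    def classify_scene(frame):
        """Label the frame by whichever arm deviates more from vertical."""
        left = limb_angle_to_vertical(frame, SHOULDER_L, HAND_L)
        right = limb_angle_to_vertical(frame, SHOULDER_R, HAND_R)
        return "left" if left > right else "right"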

Abstract

The invention discloses a human body behavior recognition method for a human-machine collaborative assembly scene. The method comprises the following steps: acquiring the joint point coordinate streams of skeletal joints under human behavior from somatosensory devices; screening for joint point coordinate streams with complete skeletal joints, and locating the start and end positions of the action with a motion event segmentation algorithm to obtain the joint point information; resampling the joint point information according to the included angle θ to obtain the joint point coordinates; normalizing the coordinates of the other joint points and smoothing to obtain the skeleton sequence that forms an action; simplifying the vectors of adjacent upper-limb joint points to obtain the vector direction of each upper limb, calculating the included angles between the left and right upper-limb vector directions and the vertical direction, and classifying the scene as a left-hand or right-hand scene from these angles; training human behavior recognition separately for the left-hand and right-hand scenes; and fusing the behavior outputs of the two scenes to realize behavior recognition in the human-machine collaboration scene.
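
The abstract trains one recognizer per scene and fuses their outputs without specifying the fusion rule; a minimal gating sketch, with fuse_outputs and the class-probability interface as assumptions:

    import numpy as np

    def fuse_outputs(p_left, p_right, scene):
        """Gate the class-probability vectors of the two scene-specific
        recognizers by the detected scene, then pick the top class. A simple
        placeholder rule; the patent does not disclose its actual fusion scheme."""
        probs = p_left if scene == "left" else p_right
        return int(np.argmax(probs))

    # e.g. fuse_outputs(np.array([0.1, 0.9]), np.array([0.7, 0.3]), "right") -> 0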

Description

Technical Field

[0001] The present invention relates to the technical field of human body behavior recognition, and in particular to a human body behavior recognition method for a human-machine collaborative assembly scene.

Background

[0002] Existing recognition methods are applied in simple scenes and single-type human-machine collaborative assembly environments. Taking the human-machine collaborative assembly of a chair as an example, the human leads the assembly while the robot, as an assistant, passes chair accessories (such as chair legs) and assembly tools (such as a hexagonal wrench) to the person; the human and machine need only a few cooperative steps to complete the assembly task.

[0003] However, the accuracy of behavior recognition is not high, the recognition speed is not fast, and recognition errors occur easily, resulting in low assembly efficiency.

Contents of the Invention

[0004] The present invention aims to solve one of the...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V40/20, G06V40/10, G06N3/04, G06N3/08, G06V10/82
CPC: G06N3/08, G06N3/045, G06N3/0464, A61B5/1122, A61B5/1123
Inventors: 陈鹏展, 李芳
Owner: TAIZHOU UNIV