
Human body action mapping method and device, computer equipment and storage medium

A human body motion mapping method and device, applied in the field of automation, which can solve the problem that robot follow-up calculations are complex and time-consuming, and achieve the effects of reduced computational complexity, fast response, and an increased calculation frequency.

Active Publication Date: 2021-09-28
GUANGDONG INST OF ARTIFICIAL INTELLIGENCE & ADVANCED COMPUTING +2

AI Technical Summary

Problems solved by technology

[0006] The embodiments of the present invention propose a human body motion mapping method, device, computer equipment, and storage medium to solve the problem that robot follow-up calculations are complex and time-consuming.


Image

  • Human body action mapping method and device, computer equipment and storage medium

Examples


Embodiment 1

[0035] Figure 1 is a flowchart of a human motion mapping method provided by Embodiment 1 of the present invention. This embodiment can be applied to a scenario in which IMUs collect the motion of joints specified by the user, the posture is calculated, and the result is mapped to the machine joints of a robot in real time. The method can be performed by a human body motion mapping device, which can be implemented in software and/or hardware and configured in a computer device such as a personal computer, server, or workstation. It specifically includes the following steps:

[0036] Step 101, read inertial data from multiple inertial measurement units.

[0037] In this embodiment, a plurality of inertial measurement units can be pre-configured. An inertial measurement unit is a sensor, or a collection of sensors, that measures the data generated when its carrier moves (i.e., inertial data). Inertial measurement units include the following types:

[0038] 1. Six-axi...
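Step 101 can be pictured as polling a registry of body-mounted units for their latest samples. The sketch below is illustrative only: the `ImuArray` class, the part names, and the sample fields are hypothetical helpers, not structures prescribed by the patent.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class InertialData:
    accel: Tuple[float, float, float]  # accelerometer reading, m/s^2
    gyro: Tuple[float, float, float]   # gyroscope reading, rad/s

class ImuArray:
    """Holds the latest sample from each body-mounted IMU, keyed by body part."""
    def __init__(self) -> None:
        self._latest: Dict[str, InertialData] = {}

    def push(self, part: str, sample: InertialData) -> None:
        self._latest[part] = sample

    def read_all(self) -> Dict[str, InertialData]:
        # Step 101: read inertial data from every configured unit
        return dict(self._latest)

imus = ImuArray()
imus.push("upper_arm", InertialData((0.0, 0.0, 9.81), (0.0, 0.0, 0.0)))
imus.push("forearm", InertialData((0.1, 0.0, 9.80), (0.0, 0.1, 0.0)))
print(sorted(imus.read_all()))  # ['forearm', 'upper_arm']
```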

Embodiment 2

[0137] Figure 3 is a flowchart of a human body motion mapping method provided by Embodiment 2 of the present invention. This embodiment builds on the foregoing embodiments and further adds operations for presenting human body motions in virtual reality. The method specifically includes the following steps:

[0138] Step 301, read inertial data from multiple inertial measurement units.

[0139] Here, a plurality of inertial measurement units are mounted on the user's body parts, and each pair of adjacent inertial measurement units is separated by one of the user's body joints.

[0140] Step 302, fusing the inertial data into the attitude of the inertial measurement unit in space.
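Step 302's fusion of raw inertial data into an attitude is commonly done with a complementary or Kalman filter; the patent does not fix the algorithm, so the sketch below shows a minimal single-axis complementary filter as one plausible realization (the function name and the `alpha` gain are assumptions).

```python
import math

def fuse_pitch(pitch_prev, gyro_y, accel_x, accel_z, dt, alpha=0.98):
    """One complementary-filter step: integrate the gyroscope, then correct
    its drift with the gravity direction seen by the accelerometer."""
    gyro_pitch = pitch_prev + gyro_y * dt        # short-term: gyro integration
    accel_pitch = math.atan2(-accel_x, accel_z)  # long-term: gravity reference
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# A unit lying flat and still (gravity on +z, no rotation) keeps pitch at 0
pitch = 0.0
for _ in range(100):
    pitch = fuse_pitch(pitch, gyro_y=0.0, accel_x=0.0, accel_z=9.81, dt=0.01)
print(pitch)  # 0.0
```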

[0141] Step 303, loading business scenarios.

[0142] Step 304, in the business scenario, generate a virtual part matching the body part according to the posture.

[0143] In some business scenarios, a 3D engine can be started, such as the Unity engine, Unreal Engine, Cocos2dx engine, a self-developed engine, ...

Embodiment 3

[0170] Figure 4 is a structural block diagram of a human body action mapping device provided in Embodiment 3 of the present invention. The device may specifically include the following modules:

[0171] The inertial data reading module 401 is used to read inertial data from a plurality of inertial measurement units, where the plurality of inertial measurement units are mounted on the user's body parts and two adjacent inertial measurement units are separated by one of the user's body joints;

[0172] An attitude fusion module 402, configured to fuse the inertial data into the attitudes of the inertial measurement units in space;

[0173] A joint angle calculation module 403, configured to calculate the joint angles of the body joints according to the postures corresponding to two adjacent inertial measurement units;

[0174] A joint angle mapping module 404, configured to map the joint angles to machine joints of the robot;

[0175] The robot driving module 405 is co...
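Modules 403 and 404 can be sketched together: the joint angle follows from the relative rotation between the two adjacent units' postures, and is then clamped to the machine joint's range before being sent to the robot. This is an illustration under stated assumptions; the quaternion representation, helper names, and joint limits are not formulas given by the patent.

```python
import math

def quat_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def joint_angle(q_parent, q_child):
    """Module 403: the rotation angle of the relative quaternion between the
    two adjacent IMUs' postures is taken as the body joint's angle."""
    w = quat_mul(quat_conj(q_parent), q_child)[0]
    return 2.0 * math.acos(max(-1.0, min(1.0, abs(w))))

def map_to_machine_joint(angle, lo, hi):
    # Module 404: clamp the human joint angle to the robot joint's range
    return max(lo, min(hi, angle))

# Child IMU rotated 90 degrees about z relative to the parent IMU
q_parent = (1.0, 0.0, 0.0, 0.0)
q_child = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(round(math.degrees(joint_angle(q_parent, q_child))))  # 90
```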



Abstract

An embodiment of the invention provides a human body action mapping method and device, computer equipment, and a storage medium. The method comprises the following steps: reading inertial data from a plurality of inertial measurement units, where the units are attached to body parts of a user and each pair of adjacent units is separated by a body joint of the user; fusing the inertial data into the postures of the inertial measurement units in space; calculating the joint angles of the body joints according to the postures corresponding to two adjacent inertial measurement units; mapping the joint angles to machine joints of a robot; and driving the robot to move until the adjacent machine joints reach the joint angles. Because the joint angle of a body joint is calculated directly from the postures of the inertial measurement units and then mapped to a machine joint of the robot, the operation is simple, the computational complexity is reduced, and the calculation frequency is increased, so the robot follows closely, responds quickly, and can be guided to perform a variety of flexible actions.

Description

Technical Field

[0001] The embodiments of the present invention relate to the technical field of automation, and in particular to a human body action mapping method, device, computer equipment, and storage medium.

Background Technique

[0002] With the development of science and technology, learning human actions, mapping them to robots, and having the robots execute those actions can be widely applied in many business scenarios.

[0003] For example, in the business scenario of firefighting and disaster relief, fire-fighting robots can help firefighters carry out rescue work remotely; ... surgery or virtual surgery.

[0004] The methods of extracting human motion mainly include optical motion capture, curvature-sensor motion capture, and IMU (Inertial Measurement Unit) motion capture. Such motion capture requires the sensors and joints to achieve a high-precision fit, and the measurable degrees of freedom are also limited. Therefore, IMU motion cap...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G16H40/67; G06T13/20; B25J9/00; B25J9/16
CPC: G16H40/67; G06T13/20; B25J9/0081; B25J9/1671
Inventors: 刘天鑫, 蒿杰, 梁俊, 刘嘉瑞, 舒琳, 黄玉清, 叶渐豪
Owner GUANGDONG INST OF ARTIFICIAL INTELLIGENCE & ADVANCED COMPUTING