Game character action recognition and generation method based on neural network

A game character and neural network technology, applied to biological neural network models, animation production, neural architectures, etc. It aims to solve the problems of high demands on production staff, independent and discontinuous inter-frame data, and long production times, and to achieve simple, robust 3D pose prediction that is less sensitive to noisy data.

Pending Publication Date: 2020-12-04
成都先知者科技有限公司
Cites: 10 · Cited by: 11

AI Technical Summary

Problems solved by technology

[0006] The third method is manual production by professionals. This method requires professional animators to produce the corresponding movements frame by frame, which is time-consuming and inefficient, and high-quality movements place high demands on the production staff.
Some existing approaches predict 3D pose from a given set of 2D keypoints by exploiting image features and 2D ground truth, or by simply predicting keypoint depth. Traditional video pose estimation mostly works on single-frame data, so the data between frames is independent and discontinuous.

Method used



Examples


Embodiment

[0044] As shown in Figures 1 to 4, this embodiment provides a neural network-based game character action recognition and generation method, which includes the following steps:

[0045] In the first step, the user obtains a video file containing human body movement; it may be shot with a mobile phone or downloaded from the Internet. In this embodiment, a video camera is used.

[0046] In the second step, the user uploads the video file containing human body movement to the server.

[0047] In the third step, the server splits the video file uploaded by the user into continuous frame images at a given frame rate.

[0048] In the fourth step, the continuous frame images are input into the human body 2D keypoint detection neural network, as follows:

[0049] (1) Resize each image to a uniform size.

[0050] (2) Obtain the 2D keypoint coordinate data of the human body in each frame through the human body keypoint detection neural...
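The resizing and per-frame keypoint extraction in steps (1) and (2) might look like the sketch below. The input size, the 17-joint skeleton, and the `detector` callable are all assumptions; the patent does not name a specific keypoint network.

```python
import numpy as np

INPUT_SIZE = (256, 256)  # assumed uniform network input size (width, height)
NUM_KEYPOINTS = 17       # e.g. a COCO-style human skeleton (assumption)


def resize_with_scale(image, size=INPUT_SIZE):
    """Nearest-neighbour resize to a uniform size; returns the resized
    image and the (sx, sy) factors needed to map keypoints back."""
    h, w = image.shape[:2]
    sx, sy = size[0] / w, size[1] / h
    ys = (np.arange(size[1]) / sy).astype(int).clip(0, h - 1)
    xs = (np.arange(size[0]) / sx).astype(int).clip(0, w - 1)
    return image[ys][:, xs], (sx, sy)


def extract_2d_keypoints(frames, detector):
    """Run the (assumed) detector on each resized frame and map the
    predicted (x, y) coordinates back to the original resolution."""
    sequence = []
    for frame in frames:
        resized, (sx, sy) = resize_with_scale(frame)
        kpts = detector(resized)          # expected shape (NUM_KEYPOINTS, 2)
        kpts = kpts / np.array([sx, sy])  # undo the resize scaling
        sequence.append(kpts)
    return np.stack(sequence)  # (num_frames, NUM_KEYPOINTS, 2)
```

Keeping the scale factors lets the per-frame keypoints be expressed in the original image coordinates, which keeps the sequence consistent when frames are later fed to the pose-prediction network.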



Abstract

The invention discloses a game character action recognition and generation method based on a neural network. The method comprises the following steps: collecting a human motion video and segmenting it into continuous frame images at a given frame rate; detecting and extracting the 2D coordinate data of human body keypoints in each sequential frame image with a human body keypoint neural network; inputting the 2D keypoint coordinate data of each sequential frame into a posture prediction neural network to obtain the corresponding 3D posture data; and binding the 3D posture data to an animation skeleton to obtain game character action data. The method has the advantages of simple logic, accurate recognition, strong recognition performance, and a small computational workload, and has high practical and promotional value in the technical field of computer vision.
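The 2D-to-3D "pose lifting" step that the abstract describes can be sketched as follows. The actual architecture of the posture prediction network is not disclosed, so this uses a generic two-layer perceptron with random weights purely to show the tensor shapes involved; every name and dimension here is an assumption.

```python
import numpy as np

NUM_KEYPOINTS = 17  # assumed skeleton size


def lift_2d_to_3d(keypoints_2d, w1, b1, w2, b2):
    """Map a sequence of 2D poses (T, K, 2) to 3D poses (T, K, 3)
    with a per-frame two-layer perceptron (illustrative only)."""
    T, K, _ = keypoints_2d.shape
    x = keypoints_2d.reshape(T, K * 2)
    h = np.maximum(0.0, x @ w1 + b1)  # ReLU hidden layer
    y = h @ w2 + b2
    return y.reshape(T, K, 3)


# Randomly initialised weights, standing in for a trained network.
rng = np.random.default_rng(0)
w1 = rng.normal(size=(NUM_KEYPOINTS * 2, 64)) * 0.1
b1 = np.zeros(64)
w2 = rng.normal(size=(64, NUM_KEYPOINTS * 3)) * 0.1
b2 = np.zeros(NUM_KEYPOINTS * 3)

poses_2d = rng.normal(size=(10, NUM_KEYPOINTS, 2))  # 10 frames of 2D poses
poses_3d = lift_2d_to_3d(poses_2d, w1, b1, w2, b2)
```

The resulting per-frame 3D joint positions would then be retargeted onto the animation skeleton, as the abstract's final step describes.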

Description

Technical field

[0001] The invention relates to the technical field of computer vision, in particular to a neural network-based game character action recognition and generation method.

Background technique

[0002] In recent years, with the development and application of computer technology and artificial intelligence, human motion reconstruction based on computer vision has risen rapidly and received extensive attention. Human motion reconstruction remains a very challenging topic in computer vision; it involves image processing, pattern recognition, artificial intelligence, and other disciplines, and has broad application prospects in digital film, animation production, game development, and other fields.

[0003] This invention is mainly applied to the production of game actions. Traditional game action production techniques mainly include the following three types:

[0004] The first is ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T 13/40, G06K 9/00, G06N 3/04
CPC: G06T 13/40, G06V 40/20, G06N 3/045
Inventor: 廖健伟李阳林受颖周泽培袁晓敏
Owner: 成都先知者科技有限公司