
Motion sensing game interactive method and system based on deep learning and big data

A technology combining deep learning and somatosensory gaming, applied in the field of human-computer interaction. It addresses problems of existing somatosensory games, such as expensive hardware, complex configuration environments, and obstacles to popularization and promotion, and achieves strong adaptive recognition ability, a good somatosensory experience, and reduced cost.

Inactive Publication Date: 2017-10-24
深圳市泽科科技有限公司


Problems solved by technology

Traditional electronic games require players to sit in front of the game equipment for long periods, which is not conducive to physical and mental health. In somatosensory games, players send all instructions to the game through their bodies, moving freely along with the game. This greatly enriches the player's sense of immersion, gives players an excellent somatosensory experience, and lets them exercise while playing.
[0003] At present, somatosensory games are implemented in the following ways: 1. Microsoft's XBOX 360 uses Kinect 3D cameras to collect the player's body and skeleton information to recognize body movements; this achieves high recognition accuracy but requires expensive equipment. 2. Wearable or handheld sensors collect the player's body information to recognize body movements; this requires the player to wear or hold a sensor, which may cause discomfort and affect the player's experience.
All in all, these somatosensory games impose strict hardware requirements, complex configuration environments, and high costs, which hinder the popularization and promotion of somatosensory games.



Examples


Embodiment 1

[0041] As shown in Figure 1, the embodiment of the present invention provides a deep-learning-based somatosensory game interaction method, including:

[0042] Step S101: collect body-action videos from different games in advance to obtain an action-video sample database, and add a corresponding action tag to each sample; the action tags correspond one-to-one to game-object control instructions;
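Step S101 can be sketched as a labeled sample database whose tags map one-to-one to control instructions. This is a minimal illustrative sketch; the tag names, instruction names, and record layout are assumptions, not taken from the patent.

```python
# Hypothetical one-to-one mapping from action tags to game-object control
# instructions (illustrative names, not from the patent text).
ACTION_TO_INSTRUCTION = {
    "jump": "MOVE_UP",
    "squat": "MOVE_DOWN",
    "lean_left": "MOVE_LEFT",
    "lean_right": "MOVE_RIGHT",
}

def add_sample(database, video_path, action_tag):
    """Attach the action tag and its control instruction to one video sample."""
    if action_tag not in ACTION_TO_INSTRUCTION:
        raise ValueError(f"unknown action tag: {action_tag}")
    database.append({
        "video": video_path,
        "action": action_tag,
        "instruction": ACTION_TO_INSTRUCTION[action_tag],
    })

db = []
add_sample(db, "clips/clip_0001.mp4", "jump")
```

Because the mapping is one-to-one, the recognized action classification can later be converted directly into a control instruction for the game object.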

[0043] Step S102: perform deep-learning training on the above video sample data set and establish a deep convolutional neural network. During training, the error is backpropagated and the network weight parameters are updated with the stochastic gradient descent method until the network's loss function reaches a minimum. The deep convolutional neural network takes as input a video of the player's body movements during the game and outputs a prediction of the player's action; the prediction result includes the action classification and its probability distribution data;
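The training loop in Step S102 (forward pass, backpropagated error, stochastic-gradient-descent weight update, loss driven toward a minimum) can be made concrete with a small sketch. A real system would train a deep convolutional network on video; a single linear softmax layer on synthetic features is used here only to show the update rule.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))          # 64 samples, 10 features (stand-in for video features)
y = rng.integers(0, 4, size=64)        # 4 action classes
W = np.zeros((10, 4))                  # weight parameters to be learned

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def loss_and_grad(W, X, y):
    p = softmax(X @ W)                              # forward pass
    loss = -np.log(p[np.arange(len(y)), y]).mean()  # cross-entropy loss
    p[np.arange(len(y)), y] -= 1                    # backpropagated error dL/dz
    grad = X.T @ p / len(y)                         # gradient w.r.t. the weights
    return loss, grad

lr = 0.5
losses = []
for _ in range(100):
    loss, grad = loss_and_grad(W, X, y)
    W -= lr * grad                                  # stochastic-gradient-descent update
    losses.append(loss)
# the loss decreases toward a minimum as training proceeds
```

The same forward/backward/update cycle applies unchanged when the linear layer is replaced by a deep convolutional network.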

Embodiment 2

[0054] As shown in Figure 2, the embodiment of the present invention provides a big-data-based method for online optimization of the deep-learning network model, including:

[0055] Step S201: use the big data platform to collect videos of the player's body movements during the game;

[0056] Step S202: preprocess the videos, send them to the cloud server, and build an action-video sample database on the cloud server;

[0057] Step S203: periodically use the video sample database to fine-tune the deep convolutional neural network model obtained from offline training, further improving the network's recognition accuracy;

[0058] Step S204: periodically push the online fine-tuned network model to the somatosensory game interaction system, so that players obtain a better somatosensory experience.
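The steps S201–S204 above form a collect–preprocess–fine-tune–deploy loop, which can be sketched as follows. All function names, the model representation, and the queue are illustrative assumptions standing in for the big-data platform, cloud server, and real training code.

```python
import queue

def preprocess(video):
    """Placeholder for S202 preprocessing, e.g. dropping irrelevant frames."""
    return {"frames": video["frames"], "label": video["label"]}

def fine_tune(model, samples, lr=0.01):
    """S203: continue training with a small learning rate on new samples."""
    for _ in samples:
        model["steps"] += 1           # stand-in for one SGD update per sample
    model["lr"] = lr
    return model

incoming = queue.Queue()              # S201: videos from the big-data platform
incoming.put({"frames": 30, "label": "jump"})

sample_db = []                        # S202: cloud-side sample database
while not incoming.empty():
    sample_db.append(preprocess(incoming.get()))

model = {"steps": 1000, "lr": 0.1}    # model from offline training
model = fine_tune(model, sample_db)   # S203: periodic fine-tuning
deployed = dict(model)                # S204: push updated model to the game system
```

Fine-tuning here means continuing SGD from the offline-trained weights at a reduced learning rate, which is the usual way to adapt a trained network to freshly collected data without discarding what it already learned.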

[0059] The embodiment of the present invention preprocesses the video, including removing frames irrelevant to the game player's operation co...
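One plausible way to remove frames irrelevant to the player's operation, sketched below, is to drop near-static frames by inter-frame differencing. This is an illustrative assumption, not the patent's stated algorithm.

```python
import numpy as np

def filter_static_frames(frames, threshold=1.0):
    """Keep the first frame plus any frame whose mean absolute difference
    from the previously kept frame exceeds the threshold (i.e. frames
    that actually contain player motion)."""
    kept = [frames[0]]
    for f in frames[1:]:
        if np.abs(f - kept[-1]).mean() > threshold:
            kept.append(f)
    return kept

# toy 4x4 grayscale frames: two identical static frames, then one with motion
video = [np.zeros((4, 4)), np.zeros((4, 4)), np.full((4, 4), 5.0)]
moving = filter_static_frames(video)   # the duplicate static frame is dropped
```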

Embodiment 3

[0062] As shown in Figures 3 and 4, the embodiment of the present invention provides a somatosensory game interaction system based on deep learning and big data, including:

[0063] The deep convolutional neural network offline training module 301 is used for:

[0064] Deep-learning training is performed on a training sample data set composed of videos of players' body movements during games, and a deep convolutional neural network is established. During training, the error is backpropagated and the network weight parameters are updated with the stochastic gradient descent method until the network's loss function reaches a minimum. The deep convolutional neural network takes an action video of the game player as input and outputs a prediction of the player's action; the prediction result includes the action classification and its probability distribution data;
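The module's output described above, an action classification together with its probability distribution, can be sketched with a softmax head over extracted video features. The class names, feature size, and random weights are illustrative assumptions.

```python
import numpy as np

CLASSES = ["jump", "squat", "lean_left", "lean_right"]

def predict(features, W):
    """Return the predicted action class and the probability distribution."""
    z = features @ W
    p = np.exp(z - z.max())
    p /= p.sum()                       # probability distribution over actions
    return CLASSES[int(np.argmax(p))], p

rng = np.random.default_rng(1)
W = rng.normal(size=(10, 4))           # stand-in for trained network weights
action, probs = predict(rng.normal(size=10), W)
```

Returning the full distribution, not just the top class, lets the interaction module reject low-confidence predictions instead of sending spurious control commands to the game.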

[0065] The real-time human-compute...


Abstract

The invention discloses a motion sensing game interactive method and system based on deep learning and big data, belonging to the technical field of human-computer interaction. The method includes the steps of: I, collecting a motion-video sample dataset; II, establishing a deep convolutional neural network model and training it offline; III, using the deep convolutional neural network model. The system comprises a deep convolutional neural network offline training module, a real-time human-computer interaction module, and a big-data-based deep network model online optimization module. Ordinary cameras acquire game players' operation videos in real time, high-level semantic features of the motion videos are extracted by the deep convolutional neural network, and limb movements are recognized and converted into actual control data for the game target, so that the game target can be controlled to move correspondingly with the player's limbs.

Description

Technical field

[0001] The invention discloses a somatosensory game interaction method and system based on deep learning and big data, belonging to the technical field of human-computer interaction.

Background technique

[0002] A somatosensory game is a new type of electronic game that is operated and experienced through changes in the player's body movements. Compared with traditional electronic game systems that rely on a mouse, keyboard, gamepad, or other devices as the interactive medium, somatosensory games use recognition of the player's body movements as the interaction method. Traditional electronic games require players to sit in front of the game equipment for long periods, which is not conducive to physical and mental health. In somatosensory games, players send all instructions to the game through their bodies, moving freely along with the game, which greatly enriches the player's sense of immersion and gives players an excellent somatosensory experience,...


Application Information

IPC(8): G06F3/01; G06K9/00; G06K9/62; G06N3/08
CPC: G06F3/011; G06N3/084; G06V40/20; G06F18/214
Inventors: 吕怡静, 刘伟平, 杜戈, 奚杭
Owner: 深圳市泽科科技有限公司