
Online man-machine interaction method based on the E-SOINN network

An online human-computer interaction method based on the E-SOINN network, a human-computer interaction technology addressing the problem that large-scale neural network operations are difficult to run on low-performance mobile terminals.

Active Publication Date: 2018-02-23
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide an online human-computer interaction method based on the E-SOINN network, in view of the defect that existing machine-learning-based gesture interaction methods make it difficult to realize large-scale neural network operations on low-performance mobile terminals.



Examples


Embodiment Construction

[0041] As shown in Figure 1, the online human-computer interaction method based on the E-SOINN network is mainly divided into two stages: training the neural network and gesture interaction recognition.

[0042] 1. The steps to train the neural network are as follows:

[0043] 1. Obtain a gesture frame sequence from the video library; let the total number of frames be n.

[0044] 2. Apply median filtering to all images in the frame sequence to remove noise and improve robustness.

[0045] 3. Extract the i-th, (i+1)-th, and (i+2)-th frames from the sequence, denoted I_i, I_(i+1), I_(i+2) (the initial value of i is 1).

[0046] 4. Suppose the image is an RGB three-channel image. Compute the average value of each of the three R, G, B components of image I_i to determine the average gray value of I_i, then adjust the RGB value of each pixel of I_i so that the adjusted average values of each ...
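Steps 2 and 4 above can be sketched in NumPy. Note that the normalization rule in step 4 is truncated in the source text, so the per-channel mean scaling toward a common target shown below is an assumption about its intent, not the patent's exact formula:

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter on a grayscale image (step 2); edges handled by padding."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine shifted views of the padded image and take the per-pixel median.
    shifts = [padded[r:r + h, c:c + w] for r in range(3) for c in range(3)]
    return np.median(np.stack(shifts), axis=0)

def normalize_brightness(rgb, target=128.0):
    """Scale each R, G, B channel so its mean equals `target` (assumed form of step 4)."""
    out = rgb.astype(np.float64)
    for ch in range(3):
        mean = out[..., ch].mean()
        if mean > 0:
            out[..., ch] *= target / mean
    return np.clip(out, 0, 255).astype(np.uint8)
```

Equalizing the channel means across frames in this way would make the subsequent inter-frame difference less sensitive to illumination changes between frames.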


PUM

No PUM

Abstract

The invention discloses an online man-machine interaction method based on the E-SOINN network. The method comprises: acquiring gesture action video; obtaining gesture outline images through the inter-frame difference method; training an E-SOINN self-organizing incremental neural network on these outlines; and, at recognition time, collecting gesture video with a mobile phone, extracting gesture outline images, and performing gesture recognition through the network. The invention mainly targets interactive gesture recognition for low-cost mobile terminals: based on the inter-frame difference method and the E-SOINN self-organizing incremental neural network, the client captures gesture video and recognition is realized over the network, enabling gesture recognition on low-performance mobile terminals.
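Since the embodiment extracts three consecutive frames, the gesture outline is presumably obtained by a three-frame difference. The source text truncates the details, so the following is a sketch of one standard realization of the inter-frame difference method, not necessarily the patent's exact variant:

```python
import numpy as np

def three_frame_difference(f0, f1, f2, thresh=25):
    """Binary motion outline from three consecutive grayscale frames.

    Thresholds the two successive absolute difference images and ANDs
    them, keeping only pixels that moved in both intervals. The
    threshold value 25 is an illustrative assumption.
    """
    d1 = np.abs(f1.astype(np.int16) - f0.astype(np.int16)) > thresh
    d2 = np.abs(f2.astype(np.int16) - f1.astype(np.int16)) > thresh
    return (d1 & d2).astype(np.uint8) * 255
```

The AND of the two difference masks suppresses the "ghosting" a simple two-frame difference produces, which matters for extracting a clean hand outline to feed the E-SOINN network.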

Description

Technical field

[0001] The invention belongs to the technical field of human-computer interaction, and in particular relates to an online human-computer interaction method based on an E-SOINN network.

Background technique

[0002] The development of human-computer interaction is the process of moving from people adapting to computers to computers gradually adapting to people, evolving from the user interface stage to the multi-channel, multimedia intelligent human-computer interaction stage. As the most flexible part of the body, the hand was the earliest tool used for physical communication between people. Dynamic gesture recognition is an important research topic in the field of human-computer interaction, with important theoretical research significance and broad application prospects.

[0003] Research on vision-based dynamic gesture recognition technology started relatively early abroad; researchers there have rich experience in this field and have achieved certain research results...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06K9/00, G06N3/02
CPC: G06N3/02, G06V40/113, G06V40/28
Inventor: 杨滨
Owner JIANGNAN UNIV