
Hand detection tracking and musical instrument detection combined interaction method and system

An interaction method and system technology, applied in the field of interaction methods and systems combining hand detection and tracking with musical instrument detection, which can solve the problems that existing methods cannot identify the playing area of the musical instrument and cannot provide key point parameters of the hand.

Pending Publication Date: 2021-07-23
杭州小伴熊科技有限公司

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that it cannot identify the playing area of the musical instrument and cannot provide the key point parameters of the hand.



Examples


Embodiment 1

[0066] As shown in figure 1, step 100 is executed to collect video and/or images using a collection device.

[0067] Step 110 is executed to generate a hand recognition model and a musical instrument detection model. The generation method of the hand recognition model comprises the following substeps:

[0068] Step 101: Configure camera parameters, collect musical instrument data in batches, and label them. The labeling method is to label the N key points of the hand with X and Y values according to the position of the hand in the image, where N is the number of key points.

[0069] Step 102: Process the collected images to generate estimated X and Y values. Compress the batches of collected images, perform normalization operations on the compressed images, use the MobileNet deep convolutional neural network to perform feature extraction to generate feature maps, use the average pooling layer to perform pooling operations on the feature maps, and use the average ...
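A minimal sketch of the key-point regression described in steps 101-102 (compress the images, normalize them, extract features with MobileNet, pool the feature maps, and regress the X and Y values of the N key points) is given below. It assumes a PyTorch/torchvision MobileNetV2 backbone, a 256x256 input, and N = 21 key points; these choices, and all names in the code, are illustrative assumptions rather than the patent's exact network.

```python
# Sketch of the hand key-point regression in steps 101-102.
# Assumptions (not from the patent): PyTorch + torchvision MobileNetV2 backbone,
# 256x256 input, N = 21 key points, a plain linear regression head for (X, Y) pairs.
import torch
import torch.nn as nn
from torchvision import models, transforms

N_KEYPOINTS = 21  # number of labeled hand key points (assumed)

# "Compress" (resize) the collected images and normalize them.
preprocess = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),                      # scales pixel values to [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

class HandKeypointRegressor(nn.Module):
    def __init__(self, n_keypoints: int = N_KEYPOINTS):
        super().__init__()
        # MobileNet deep convolutional network used as the feature extractor.
        self.backbone = models.mobilenet_v2(weights=None).features
        # Average pooling over the feature maps, then regress 2*N values (X, Y per key point).
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(1280, 2 * n_keypoints)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(x)              # feature maps
        pooled = self.pool(feats).flatten(1)  # pooled feature vector
        return self.head(pooled)              # estimated X and Y values, shape (B, 2*N)

# Example: a dummy preprocessed image through the model.
if __name__ == "__main__":
    model = HandKeypointRegressor()
    dummy = torch.rand(1, 3, 256, 256)        # stands in for a compressed, normalized image
    print(model(dummy).shape)                 # torch.Size([1, 42])
```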

Embodiment 2

[0081] As shown in figure 2, an interactive system combining hand detection and tracking with musical instrument detection includes a collection device 200, a training module 210 and a detection module 220.

[0082] Collection device 200: for collecting video and/or images.

[0083] Training module 210: for generating a hand recognition model and a musical instrument detection model. The generation method of the hand recognition model comprises the following substeps:

[0084] Step 101: Configure camera parameters, collect musical instrument data in batches, and label them. The labeling method is to label the N key points of the hand with X and Y values according to the position of the hand in the image, where N is the number of key points.

[0085] Step 102: Process the collected images to generate estimated X and Y values. Compress the batches of collected images, perform normalization operations on the compressed images, use the MobileNet deep convolutiona...
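Read as a data flow, the three modules of this embodiment chain together as collect → train → detect. The skeleton below only shows that wiring; the class names and method signatures are illustrative assumptions, not the patent's actual interfaces.

```python
# Skeleton of the module split in Embodiment 2: collection device 200,
# training module 210, detection module 220. Names and signatures are assumed.
from typing import Any, List, Tuple

class CollectionDevice:                       # 200
    def collect(self) -> List[Any]:
        """Collect video frames and/or images, e.g. from a configured camera."""
        raise NotImplementedError

class TrainingModule:                         # 210
    def train(self, labeled_data: List[Any]) -> Tuple[Any, Any]:
        """Generate the hand recognition model and the musical instrument detection model."""
        raise NotImplementedError

class DetectionModule:                        # 220
    def __init__(self, hand_model: Any, instrument_model: Any) -> None:
        self.hand_model = hand_model
        self.instrument_model = instrument_model

    def detect(self, frame: Any) -> bool:
        """Run both models on a frame and judge whether a hand key point
        lies inside the musical instrument area to be identified."""
        raise NotImplementedError
```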

Embodiment 3

[0096] The present invention enables hands to interact with actual objects in AR scenes and to produce interactive effects such as sounds and animations. The implementation, as shown in figure 3, is as follows:

[0097] 1. Hand detection method (as shown in Figure 4)

[0098] 1. Hand detection and tracking;

[0099] 2. Hand and object detection and tracking.

[0100] 3. Configure the camera parameters to ensure that the size of the captured photos is 480*480, collect musical instrument data in batches, and label them. The labeling method labels the X and Y values of the 21 key points of the hand according to the position of the hand in the image.

[0101] 4. Compress the batch-collected images to a size of 256*256, perform normalization operations on the compressed images, and then use the MobileNet deep convolutional neural network to perform feature extraction to generate feature maps, and then use the average pooling lay...
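The interaction described in this embodiment (a sound or animation is produced when a hand touches the instrument in the AR scene) reduces to a per-frame point-in-region test on the detected hand key points. A minimal sketch is given below; it assumes the instrument detector returns an axis-aligned bounding box per playing area and the hand model returns the 21 (x, y) key points. The function names and the sound callback are illustrative, not the patent's interface.

```python
# Sketch of the judgment rule: does any hand key point fall inside a detected
# instrument playing area? All names here are illustrative assumptions.
from typing import Callable, List, Tuple

Point = Tuple[float, float]              # (x, y) hand key point
Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) playing area

def point_in_box(p: Point, box: Box) -> bool:
    x, y = p
    x_min, y_min, x_max, y_max = box
    return x_min <= x <= x_max and y_min <= y <= y_max

def judge_and_play(keypoints: List[Point],
                   playing_areas: List[Tuple[str, Box]],
                   play_sound: Callable[[str], None]) -> None:
    """For each detected playing area (e.g. a virtual piano key), trigger its
    sound if any of the hand key points lies inside it."""
    for note, box in playing_areas:
        if any(point_in_box(p, box) for p in keypoints):
            play_sound(note)

# Usage example with dummy detections:
if __name__ == "__main__":
    keypoints = [(120.0, 300.0), (130.0, 310.0)]   # subset of the 21 key points
    areas = [("C4", (100.0, 280.0, 160.0, 340.0)),
             ("D4", (160.0, 280.0, 220.0, 340.0))]
    judge_and_play(keypoints, areas, lambda note: print("play", note))
```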



Abstract

The invention provides a hand detection tracking and musical instrument detection combined interaction method and system. The method employs a collection device to collect videos and/or images and further comprises the following steps: generating a hand recognition model and a musical instrument detection model; respectively inputting the collected videos into the hand recognition model and the musical instrument detection model for detection; and using a judgment rule to judge whether the detected hand key point positions lie in the musical instrument area to be identified. According to the method and system, the hand key points and the musical instrument key areas are detected, whether the hand key points are in the musical instrument key areas is judged, and the system then makes corresponding sounds according to the judgment result.

Description

Technical field

[0001] The present invention relates to the technical field of AR interaction, in particular to an interaction method and system combining hand detection and tracking with musical instrument detection.

Background technique

[0002] With the increasing integration of technology and entertainment content, audiences' demand for entertainment experiences has gradually shifted from single content and low-frequency interaction to more personalized, higher-quality, and more interactive experiences. Interactive videos have therefore emerged: audiences can click the interactive components that appear in the video and select the branch plot or perspective they want to watch. Compared with traditional video, which the audience can only watch passively, interactive video gives the audience a stronger sense of immersion and interaction.

[0003] AR is a comprehensive integrated technology, involving computer graphics, human-computer interaction technology, sen...

Claims


Application Information

IPC(8): G06K9/00; G06K9/32; G06K9/62; G06K9/46; G06F3/01; G06N3/04; G06N3/08; G06T3/40; G06T7/246
CPC: G06T7/246; G06T3/4046; G06N3/04; G06N3/08; G06F3/011; G06V40/107; G06V40/117; G06V20/20; G06V20/41; G06V10/25; G06V10/462; G06F18/24
Inventor: 段若愚, 史明, 韩钰浩
Owner: 杭州小伴熊科技有限公司