
Static gesture intention recognition method and system based on dynamic feature assistance and vehicle

A dynamic-feature-assisted recognition technology applied in the field of vehicle control. It addresses problems such as increased component cost, inaccurate recognition, and unsuitability for in-car application, and achieves the effects of improving gesture control performance, reducing misrecognition, and improving recognition performance.

Pending Publication Date: 2022-07-29
CHONGQING CHANGAN AUTOMOBILE CO LTD

AI Technical Summary

Problems solved by technology

One existing method combines the output information of the recognition network without increasing hardware cost; however, because the input information itself is scarce and inaccurate, the performance optimization effect is poor.
Another example is the three-dimensional gesture recognition device and method disclosed in CN201210565201.7, which optimizes recognition by adding a dedicated recognition device comprising two infrared cameras, an image acquisition unit, infrared light-emitting diodes, an infrared LED drive unit, a computing unit, and a gesture recognition unit. That solution uses infrared illumination and infrared cameras to suppress visible-light interference, and an upward-facing camera to reduce background interference, further improving the reliability of gesture recognition. In the automotive field, however, added component cost is a sensitive issue, and the constrained camera angles inside the vehicle make this approach unsuitable for in-car application.

Method used




Embodiment Construction

[0030] The present invention will be described in detail below with reference to the accompanying drawings.

[0031] As shown in Figure 1, in this embodiment, a method for recognizing static gesture intent based on dynamic feature assistance includes the following steps:

[0032] Step 1, hand detection stage: acquire gesture image data and perform hand detection on it. If a hand position is detected, output the hand position and go to Step 2; otherwise, repeat Step 1.

[0033] In this embodiment, the original gesture image data is input into the hand detector model. YOLO V3 is selected as the target detection network of the hand detection model, with the hand as the single detection category. Training data is used to tune the parameters of the convolutional neural network, scene analysis is performed on the falsely detected and missed samples from the actual scene, and model training is completed iteratively. The model detects a...
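The paragraph above only fixes YOLO V3 and a single hand class; as a rough illustration, a one-class YOLOv3-style detector of this kind could be driven through OpenCV's DNN module as sketched below. The file names hand_yolov3.cfg and hand_yolov3.weights, the 416x416 input size, and the 0.5 confidence threshold are assumptions made for the sketch, not values taken from the patent.

```python
# Sketch of the step-1 hand detection stage with a single-class YOLOv3-style
# network loaded via OpenCV's DNN module. File names and thresholds are
# illustrative assumptions, not the patent's actual configuration.
import cv2

net = cv2.dnn.readNetFromDarknet("hand_yolov3.cfg", "hand_yolov3.weights")  # hypothetical files
out_names = net.getUnconnectedOutLayersNames()

def detect_hand(frame, conf_thresh=0.5):
    """Return the most confident hand box as (x, y, w, h) in pixels, or None."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    best_box, best_score = None, conf_thresh
    for output in net.forward(out_names):
        for det in output:                      # det = [cx, cy, bw, bh, objectness, hand score]
            score = float(det[5])               # score for the single "hand" class
            if score > best_score:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                best_box = (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))
                best_score = score
    return best_box

# Step 1 loop: keep acquiring gesture image data until a hand is found, then
# pass its position to the tracking stage (step 2).
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    box = detect_hand(frame)
    if box is not None:
        print("hand detected at", box)
        break
cap.release()
```

A production detector would also apply non-maximum suppression across overlapping boxes and, as the embodiment describes, be re-trained iteratively on falsely detected and missed samples from the real scene.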



Abstract

The invention discloses a static gesture intention recognition method and system based on dynamic feature assistance, and a vehicle. The method comprises the following steps: step 1, hand detection; step 2, hand tracking; step 3, static gesture classification; step 4, spatial position judgment; step 5, continuous frame recognition; step 6, dynamic feature extraction and judgment; step 7, static gesture judgment output; and step 8, in-vehicle application event response. According to the method, static gesture actions are classified based on the user's hand action information, and by combining this with the forward hand-motion trend features present during static gesture control, the user's static gesture intention can be recognized more accurately. Misrecognition caused by similar static poses that involve no operation, or by unconscious random waving, is reduced, the false trigger rate is lowered, and the user's experience of the cockpit gesture control function is improved.
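As a rough illustration of how the eight steps could fit together, the skeleton below wires them into a single per-frame routine. All callback names, the five-frame confirmation window, the forward-motion threshold, and the "decreasing y means moving forward" heuristic are assumptions made for this sketch, not details taken from the patent.

```python
# Hypothetical skeleton of the eight-step pipeline summarised in the abstract.
# Callbacks, thresholds and the forward-motion heuristic are illustrative
# assumptions, not the patent's actual algorithm.
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

Box = Tuple[int, int, int, int]  # hand bounding box: x, y, w, h

@dataclass
class GestureIntentPipeline:
    detect_hand: Callable[[object], Optional[Box]]   # step 1: hand detection
    classify_gesture: Callable[[object, Box], str]   # step 3: static gesture classification
    in_control_region: Callable[[Box], bool]         # step 4: spatial position judgment
    on_gesture: Callable[[str], None]                # step 8: in-vehicle application event response
    n_confirm_frames: int = 5                        # step 5: identical labels required in a row
    min_forward_motion: float = 10.0                 # step 6: required forward motion, in pixels
    _track: List[Box] = field(default_factory=list)
    _labels: List[str] = field(default_factory=list)

    def process(self, frame) -> None:
        box = self.detect_hand(frame)                # step 1
        if box is None:                              # no hand: reset and wait for the next frame
            self._track.clear(); self._labels.clear()
            return
        self._track.append(box)                      # step 2: hand tracking
        label = self.classify_gesture(frame, box)    # step 3
        if not self.in_control_region(box):          # step 4
            return
        self._labels.append(label)                   # step 5: continuous-frame recognition
        recent = self._labels[-self.n_confirm_frames:]
        if len(recent) < self.n_confirm_frames or len(set(recent)) != 1:
            return
        # Step 6: dynamic feature extraction and judgment. Require a forward
        # hand-motion trend before trusting the static pose, so a hand merely
        # resting in a gesture-like shape does not trigger anything.
        forward = self._track[0][1] - self._track[-1][1]   # assumed: decreasing y = moving forward
        if forward < self.min_forward_motion:
            return
        self.on_gesture(label)                       # steps 7-8: output the judged gesture, respond
        self._track.clear(); self._labels.clear()
```

Feeding each camera frame to process(), with a detector such as the step-1 sketch in the embodiment plus a classifier and region check plugged in, reproduces the control flow summarised in the abstract; the patent itself may order or combine the steps differently.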

Description

Technical Field

[0001] The invention belongs to the technical field of vehicle control, and in particular relates to a method, a system, and a vehicle for recognizing static gesture intention based on dynamic feature assistance.

Background Technique

[0002] Cars are a commonly used means of transportation. While ensuring safety, people also place higher demands on convenience and comfort in the use of cars. With the continuous development and upgrading of interaction methods in the cockpit, from traditional physical buttons to touch-screen control, voice control, and the now widely used gesture control, interaction is becoming more convenient and smarter. Gesture recognition has obvious advantages in scenes where voice control is inconvenient or awkward. At the same time, gesture applications that are natural and conform to the user's control logic better reflect the seamless interaction experience of the cockpit. Therefore, it is particul...

Claims


Application Information

IPC (IPC8): G06V40/20; G06V10/82; G06V10/774; G06N3/08; G06N3/04; G06F3/01; B60W50/08
CPC: G06V40/28; G06V10/774; G06V10/82; G06N3/08; G06F3/017; B60W50/08; G06N3/045
Inventor: 石林, 吴锐
Owner: CHONGQING CHANGAN AUTOMOBILE CO LTD