Gesture determination device and method, gesture-operated device, program, and recording medium

A gesture determination device and related gesture technology, applied in the fields of character and pattern recognition, data-processing input/output, and image data processing, addressing the problem that it is difficult to distinguish the operator's conscious actions from unconscious ones.

Inactive Publication Date: 2016-03-09
MITSUBISHI ELECTRIC CORP

AI Technical Summary

Problems solved by technology

However, one of the problems with gesture operations is that it is difficult to distinguish between the operator's conscious actions (actions in which operation input is intentionally performed) and unconscious actions (actions in which operation input is performed unintentionally).



Examples


Embodiment 1

[0038] Fig. 1 is a diagram showing an example of use of the gesture operation device according to Embodiment 1 of the present invention. As shown in the figure, the gesture operation device 1 recognizes gestures performed by an operator 3, seated on a seat 2 such as the driver's seat, the passenger's seat, or a rear seat of a vehicle, within a predetermined operation area 4 that lies within reach of the operator's hand, and provides operation instructions to a plurality of in-vehicle devices 6a, 6b, and 6c as operated devices via an operation control unit 5.

[0039] Next, assume a case where the operated devices are a map guidance device (navigation) 6a, an audio device 6b, and an air conditioner (air conditioning device) 6c. Operation instructions for the map guidance device 6a, the audio device 6b, and the air conditioner 6c are given through operation guidance displayed on the display unit 5a of the operation control unit 5, and ope...
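
As a rough illustration of how the operation control unit 5 could route a recognized gesture to one of the operated devices 6a to 6c, the following sketch uses hypothetical class and method names (OperationControlUnit, dispatch) that are not taken from the patent text:

```python
# Minimal sketch (not the patent's concrete implementation) of an operation
# control unit routing a recognized gesture to one of the operated devices.
class OperationControlUnit:
    def __init__(self):
        # Operated devices keyed by hypothetical gesture labels.
        self.devices = {
            "navigation": "map guidance device 6a",
            "audio": "audio device 6b",
            "air_conditioner": "air conditioner 6c",
        }

    def dispatch(self, gesture_label: str, command: str) -> str:
        """Forward an operation instruction to the device selected by the gesture."""
        device = self.devices.get(gesture_label)
        if device is None:
            return "no operation (gesture not associated with a device)"
        return f"{command} sent to {device}"


if __name__ == "__main__":
    unit = OperationControlUnit()
    print(unit.dispatch("audio", "volume_up"))  # -> "volume_up sent to audio device 6b"
```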

Embodiment 2

[0204] Fig. 14 is a block diagram showing the configuration of the gesture operation device according to Embodiment 2 of the present invention. The gesture operation device shown in Fig. 14 is largely the same as that shown in Fig. 2, and the same reference numerals denote the same or corresponding parts; it differs in that a mode control unit 18 and a memory 19 are added, and a coordinate system setting unit 13a is provided instead of the coordinate system setting unit 13 shown in Fig. 2.

[0205] First, an overview of the device will be described.

[0206] The mode control unit 18 is supplied with mode selection information MSI from the outside, and outputs mode control information D18 to the coordinate system setting unit 13a.

[0207] The coordinate system setting unit 13a is supplied with the hand area information D12 from the hand area detection unit 12, and is supplied with the mode control information D18 from the ...
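
The following is a hypothetical sketch of this data flow: the coordinate system setting unit 13a receives the hand area information D12 and the mode control information D18, and switches how the hand coordinate system is set depending on the selected mode. The mode names ("fixed", "adaptive") and the per-mode behaviour are illustrative assumptions, not taken from the patent text.

```python
import math
from dataclasses import dataclass


@dataclass
class HandAreaInfo:
    """Stand-in for the hand area information D12."""
    palm_center: tuple   # (x, y) in image coordinates
    wrist_center: tuple  # (x, y) in image coordinates


def set_hand_coordinate_system(d12: HandAreaInfo, d18_mode: str):
    """Return (origin, axis_direction) of the hand coordinate system.

    In this sketch a "fixed" mode uses a canonical upward axis, while an
    "adaptive" mode orients the axis from the wrist center toward the palm
    center, so the axis follows the angle at which the hand is held.
    """
    origin = d12.palm_center
    if d18_mode == "fixed":
        axis = (0.0, -1.0)  # image y grows downward, so (0, -1) points "up"
    else:
        dx = d12.palm_center[0] - d12.wrist_center[0]
        dy = d12.palm_center[1] - d12.wrist_center[1]
        norm = math.hypot(dx, dy) or 1.0
        axis = (dx / norm, dy / norm)
    return origin, axis
```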

Embodiment 3

[0248] Fig. 16 is a block diagram showing the configuration of the gesture operation device according to Embodiment 3 of the present invention. In Fig. 16, the same reference numerals as in Fig. 2 indicate the same or corresponding parts.

[0249] The gesture operation device shown in Fig. 16 is substantially the same as that shown in Fig. 2, but differs in that an operator estimation unit 20 is added, and an operation determination unit 17a is provided instead of the operation determination unit 17.

[0250] The operator estimation unit 20 estimates the operator based on one or both of the origin coordinates and the relative angle of the hand coordinate system output by the coordinate system setting unit 13, and outputs operator information D20 to the operation determination unit 17a. The estimation of the operator here may be, for example,...
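
A possible form of such an estimation, sketched under assumptions not stated in the excerpt (a right-hand-drive seat layout and hypothetical thresholds), is to attribute the hand to the driver or the front passenger from the sign of the origin coordinate and the relative angle of the hand coordinate system:

```python
def estimate_operator(origin_x: float, relative_angle_deg: float) -> str:
    """Return operator information D20 as a label (illustrative rule only).

    Assumes a right-hand-drive layout: a hand entering the operation area from
    the right (positive x, axis tilted leftward) is attributed to the driver,
    a hand from the left with the opposite tilt to the passenger.
    """
    if origin_x > 0.0 and relative_angle_deg < 0.0:
        return "driver"
    if origin_x < 0.0 and relative_angle_deg > 0.0:
        return "passenger"
    return "unknown"
```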



Abstract

The hand region (Rh) of an operator is detected from a photographic image, the center (Po) of the palm of the hand and the center position (Wo) of the wrist are identified, the origin coordinates (Cho) of a coordinate system for the hand and the direction of a coordinate axis (Chu) of the coordinate system for the hand are set, and a feature amount (D15h) for the movement of the hand is calculated. Furthermore, the coordinate system for the hand is used to detect the shape of the hand in the hand region (Rh), and a feature amount (D14) is calculated for the shape of the hand. Furthermore, a feature amount (D15f) for the movement of a finger is calculated on the basis of the feature amount (D14) for the shape of the hand. A gesture is assessed on the basis of the calculated feature amounts. The assessment of a gesture takes into account differences in the direction in which a hand is moved, or differences in the angle of a hand placed in an operation region, so misrecognition of operations can be reduced.
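
The geometric core of this flow can be summarized in a short sketch: set the hand coordinate system from the palm center (Po) and wrist center (Wo), then express hand movement in that coordinate system so the feature is insensitive to the angle at which the hand is placed. The feature definitions below are illustrative placeholders, not the patent's exact formulas.

```python
import math


def hand_coordinate_system(po, wo):
    """Origin Cho at the palm center, axis direction Chu pointing from wrist to palm."""
    cho = po
    dx, dy = po[0] - wo[0], po[1] - wo[1]
    norm = math.hypot(dx, dy) or 1.0
    chu = (dx / norm, dy / norm)
    return cho, chu


def movement_feature(prev_po, po, wo):
    """Hand-movement feature (in the spirit of D15h) in the hand coordinate system.

    The displacement of the palm center is projected onto the hand axis and its
    normal, so the same physical motion yields the same feature regardless of
    the hand's absolute angle in the operation area.
    """
    _, (ux, uy) = hand_coordinate_system(po, wo)
    dx, dy = po[0] - prev_po[0], po[1] - prev_po[1]
    along_axis = dx * ux + dy * uy       # component along Chu
    across_axis = -dx * uy + dy * ux     # component normal to Chu
    return along_axis, across_axis
```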

Description

Technical Field
[0001] The present invention relates to a gesture determination device and method, and to a gesture operation device. The present invention also relates to a program and a recording medium.
Background Art
[0002] In the operation of devices such as home appliances and in-vehicle devices, gesture operation based on the shape or movement of the hand is effective, since it allows operation without a remote controller and without touching an operation panel. However, one problem with gesture operation is that it is difficult to distinguish between the operator's conscious actions (actions in which operation input is intentionally performed) and unconscious actions (actions in which operation input is performed unintentionally). To solve this problem, it has been proposed to set an operation area near the operator and to recognize only actions in the operation area as gestures consciously performed by the operator. Especially in an environment where the operato...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20; G06F3/01
CPC: G06F3/017; G06V40/113; G06V40/28; G06F18/22; G06T7/20; G06T7/60
Inventor: 中村雄大, 山岸宣比古, 福田智教, 楠惠明
Owner: MITSUBISHI ELECTRIC CORP