Gesture recognition method and apparatus with improved background suppression

A gesture recognition technology that suppresses background interference, applied in image data processing, instruments, and character and pattern recognition, with the effect of avoiding misjudgment.

Active Publication Date: 2014-02-12
PIXART IMAGING INC
7 Cites 1 Cited by

AI-Extracted Technical Summary

Problems solved by technology

[0004] In other words, if the user swings his hand from left to right, the resulting center of gravity positions P1-P7 of the object will appear as follows: F...

Method used

[0053] In summary, the gesture determination method and apparatus of the present invention have at least the following advantages. First, the present invention can use the average brightness of the object at different times to adjust the movement vectors at different times, thereby avoiding misjudgment by the gesture determination mechanism. In addition, the present invention can also use...

Abstract

A gesture recognition method with improved background suppression includes the following steps. First, a plurality of images are sequentially captured. Next, a position of at least one object in each of the images is calculated to respectively obtain a moving vector of the object at different times. Then, an average brightness of the object in each of the images is calculated. Finally, magnitudes of the moving vectors of the object at different times are respectively adjusted according to the average brightness of the object in each of the images. There is further provided a gesture recognition apparatus using the method mentioned above.

Application Domain

Image analysis; Character and pattern recognition

Technology Topic

Brightness perception; Background suppression +2

Image

  • Gesture recognition method and apparatus with improved background suppression
  • Gesture recognition method and apparatus with improved background suppression
  • Gesture recognition method and apparatus with improved background suppression

Examples

  • Experimental program (1)

Example Embodiment

[0041] The foregoing and other technical content, features, and effects of the present invention will be clearly presented in the following detailed description of a preferred embodiment with reference to the drawings. Directional terms mentioned in the following embodiments, such as up, down, left, right, front, or back, refer only to the directions of the attached drawings; they are used for illustration and do not limit the present invention.
[0042] Figure 2 is a schematic diagram of a gesture determination apparatus according to an embodiment of the present invention. Referring to Figure 2, the gesture determination apparatus 200 of this embodiment includes an image acquisition unit 210 and a processing unit 220. The image acquisition unit 210 may be a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD) image sensor and is adapted to sequentially acquire a plurality of images. Specifically, the gesture determination apparatus 200 of this embodiment may include a light-emitting element 230 adapted to provide a light beam L1, so that when an object (such as a human hand) approaches the gesture determination apparatus 200, the image acquisition unit 210 can receive the light beam L1 reflected by the object and then sequentially acquire multiple images as the object moves, such as the hand-movement images I1-I5 shown in Figures 1A-1E. In this embodiment the light beam L1 of the light-emitting element 230 is infrared light by way of example, but the invention is not limited thereto; the light-emitting element 230 may also provide a light beam L1 in another invisible waveband. In practice, in order to capture an image of the light beam L1 reflected by the object, the image acquisition unit 210 must use an image sensor capable of detecting the waveband of the light beam L1.
[0043] Still referring to Figure 2, the processing unit 220 of this embodiment may be a digital signal processor (DSP) or another suitable processor for image processing and analysis, and is adapted to receive the images acquired by the image acquisition unit 210 and to perform subsequent processing and calculation. Specifically, the processing unit 220 first calculates the position of the object (for example, a hand gradually approaching or moving away from the image acquisition unit 210) in each image, thereby obtaining movement vectors V1-V6 of the object at different times, such as the center-of-gravity positions P1-P7 of the object depicted in Figure 1F and the movement vectors V1-V6 calculated from those positions.
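For illustration only, the centroid and movement-vector computation described in this paragraph might look like the following Python sketch; the brightness threshold, the NumPy usage, and the helper names are assumptions of the sketch, not details taken from the patent.

    import numpy as np

    def object_centroid(image, threshold=100):
        # Center of gravity of the bright object in one frame; the fixed
        # brightness threshold that separates the object from the background
        # is an illustrative assumption.
        mask = image > threshold
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return None                      # no object detected in this frame
        return np.array([xs.mean(), ys.mean()])

    def movement_vectors(images):
        # Movement vectors V1..V6 between consecutive centroid positions P1..P7.
        positions = [object_centroid(img) for img in images]
        positions = [p for p in positions if p is not None]
        return [p2 - p1 for p1, p2 in zip(positions, positions[1:])]
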
[0044] Generally, when calculating the position of the object, the center-of-gravity positions P1-P7 are computed according to the average brightness value of the object, and the movement vectors V1-V6 at different times are derived from them. However, if the images acquired by the image acquisition unit 210 contain background brightness (that is, each acquired image includes at least a background image), the center of gravity of the object easily falls on the same point before and after the hand is waved, as shown by the center-of-gravity positions P1 and P7 in Figure 1F, where the displacement Δx = 0. In other words, when the user waves a hand from left to right, the center-of-gravity positions P1-P7 follow the trajectory shown in Figure 1F; because the displacement Δx = 0, the processing unit 220 may conclude that the user's gesture is a circle-drawing action, resulting in a misjudgment. In this embodiment, in order to avoid this misjudgment, a weight is applied to the movement vectors, thereby generating a new trajectory and new center-of-gravity positions for the object, as described in more detail below.
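Purely as a hypothetical illustration of why Δx = 0 causes the misjudgment described above, a simplified decision rule based only on the net horizontal displacement might look as follows; the tolerance value and the gesture labels are assumptions of this sketch.

    def classify_by_net_displacement(positions, tol=2.0):
        # Naive rule using only the net horizontal displacement between the
        # first and last center-of-gravity positions.  When background
        # brightness pulls P1 and P7 onto the same point, dx is close to zero
        # and a left-to-right swipe is wrongly reported as circle drawing.
        dx = positions[-1][0] - positions[0][0]
        if dx > tol:
            return "swipe left to right"
        if dx < -tol:
            return "swipe right to left"
        return "circle drawing"              # the misjudgment case
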
[0045] Generally speaking, when the user performs a gesture, the hand is usually closer to the image acquisition unit 210 than the body, so the image of the hand appears brighter than the background image. In other words, in this embodiment, the aforementioned movement vectors V1-V6 can be adjusted by using the average brightness value of the object image as a weight, so as to obtain a corrected movement trajectory of the object and the center-of-gravity positions P21-P27 derived from the adjusted movement vectors V21-V26, as shown in Figure 3.
[0046] In this embodiment, the adjusted magnitudes of the movement vectors V21-V26 are positively correlated with the average brightness of the object in each image: the greater the average brightness of the object, the larger the proportion by which the magnitudes of the object's movement vectors at different times are adjusted, and this proportion may follow a polynomial relationship. Specifically, when the object is brighter the adjusted movement vector is larger, and when the object is darker the adjusted movement vector is smaller. In this way, the new center-of-gravity positions P21-P27 of the object differ from the original positions P1-P7, as can be seen by comparing Figure 1F with Figure 3. As Figure 3 shows, when the user's hand is swung from left to right, the center-of-gravity positions P21-P27 obtained through the aforementioned weighting of the movement vectors follow the trajectory shown in Figure 3; because the displacement Δx > 0, the processing unit 220 can determine that the user's gesture is a swipe from left to right. Conversely, if the displacement Δx < 0, it can correspondingly be determined that the user's hand is waving from right to left.
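A minimal sketch of this brightness weighting is given below, assuming one average-brightness value per movement vector and a quadratic weight as one possible polynomial relationship; the normalization and the exponent are illustrative choices, not values specified by the patent.

    import numpy as np

    def brightness_weighted_vectors(vectors, brightnesses, power=2):
        # Scale each movement vector by a weight that grows polynomially with
        # the object's average brightness in the corresponding frame.
        b = np.asarray(brightnesses, dtype=float)
        weights = (b / b.max()) ** power      # brighter object -> larger weight
        return [w * np.asarray(v) for w, v in zip(weights, vectors)]

    def rebuild_trajectory(start, adjusted_vectors):
        # Accumulate the adjusted vectors from the first position to obtain the
        # corrected center-of-gravity positions (P21-P27 in Figure 3); the net
        # Δx of this trajectory then separates a swipe from a circle.
        positions = [np.asarray(start, dtype=float)]
        for v in adjusted_vectors:
            positions.append(positions[-1] + v)
        return positions
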
[0047] It should be noted that when the user's hand movement is a rotation gesture, the movement trajectory generally stays in the same plane, so the image brightness of the object remains roughly the same. Furthermore, compared with a wave from left to right or from right to left, a rotation gesture does not tend to suddenly approach the image acquisition unit 210 and then suddenly leave and move away from it. Therefore, in this embodiment, weighting the movement vectors with brightness does not affect the determination of a rotation gesture.
[0048] It is worth mentioning that the above embodiment uses the average brightness value of the object image as the weight for adjusting the movement vectors V1-V6 merely as one example of implementation. In another embodiment, the weights of the movement vectors V1-V6 may instead be adjusted according to the size of the object image, as in the adjusted movement trajectory shown in Figure 4, which is described in more detail below.
[0049] Specifically, when the user's hand is swung from left to right, it usually approaches the image acquisition unit 210 and then moves away from it. Therefore, in this embodiment, the size of the object image can be used as a weight to adjust the aforementioned movement vectors V1-V6, so as to obtain a corrected movement trajectory of the object and the center-of-gravity positions P31-P37 derived from the adjusted movement vectors V31-V36, as shown in Figure 4. Similarly, when the user's hand is swung from left to right, the center-of-gravity positions P31-P37 obtained by weighting the movement vectors V1-V6 with the size of the object image follow the trajectory shown in Figure 4; because the displacement Δx > 0, the processing unit 220 can determine that the user's gesture is a swipe from left to right. Conversely, if the displacement Δx < 0, it can correspondingly be determined that the user's hand is waving from right to left.
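For illustration, the same weighting idea can be expressed with the object's pixel count standing in for the size of the object image; the threshold segmentation and the linear weight below are assumptions of this sketch.

    import numpy as np

    def object_size(image, threshold=100):
        # Size of the object image, approximated as the number of pixels above
        # a fixed brightness threshold (an illustrative assumption).
        return int(np.count_nonzero(image > threshold))

    def size_weighted_vectors(vectors, sizes):
        # Scale each movement vector by the relative size of the object image;
        # a hand close to the image acquisition unit appears larger and so
        # contributes more to the corrected trajectory.
        s = np.asarray(sizes, dtype=float)
        weights = s / s.max()
        return [w * np.asarray(v) for w, v in zip(weights, vectors)]
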
[0050] Similarly, when the user's hand movement is a rotation gesture, the movement trajectory generally stays in the same plane, so the image size of the object remains roughly the same. Moreover, compared with a wave from left to right or from right to left, a rotation gesture does not tend to suddenly approach the image acquisition unit 210 and then suddenly leave and move away from it. Therefore, in this embodiment, weighting the movement vectors with the object size does not affect the determination of a rotation gesture. It is worth mentioning that in this embodiment the average brightness and the size of the object may also be used together as the weight for adjusting the movement vectors V1-V6.
[0051] Figure 5 is a schematic diagram of a gesture determination apparatus according to another embodiment of the present invention. Referring to Figure 5, the gesture determination apparatus 500 of this embodiment adopts a concept similar to that of the aforementioned gesture determination apparatus 200. The difference between the two is that the gesture determination apparatus 500 of this embodiment further includes a distance measuring system 510, wherein the distance measuring system 510 measures the distance of the object relative to the image acquisition unit 210 at different times, and the processing unit 220 can adjust the weights of the movement vectors V1-V6 according to this distance, so as to prevent the gesture determination apparatus 500 from making a misjudgment. The weighted trajectories are similar to those shown in Figure 3 and Figure 4 and are not repeated here.
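As an illustration of this variant, the distance-based weight could shrink as the measured distance grows, for example with the inverse relationship below; the inverse law and the eps guard are assumptions of this sketch, not details specified by the patent.

    import numpy as np

    def distance_weighted_vectors(vectors, distances, eps=1e-6):
        # Scale each movement vector by a weight that decreases as the object
        # moves farther from the image acquisition unit: the nearest measured
        # distance gets weight 1, larger distances proportionally less.
        d = np.asarray(distances, dtype=float)
        weights = d.min() / (d + eps)
        return [w * np.asarray(v) for w, v in zip(weights, vectors)]
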
[0052] It should be noted that when the user's hand movement is a rotation gesture, the movement trajectory generally stays in the same plane, so the distance of the object relative to the image acquisition unit 210 is generally the same at different times. Moreover, compared with a wave from left to right or from right to left, a rotation gesture does not tend to suddenly approach the image acquisition unit 210 and then leave and move away from it. Therefore, in this embodiment, using the object's distance relative to the image acquisition unit 210 at different times as the weight for the movement vectors does not substantially affect the determination of a rotation gesture.
[0053] In summary, the gesture determination method and apparatus of the present invention have at least the following advantages. First, the present invention can use the average brightness of the object at different times to adjust the movement vectors at different times, thereby preventing misjudgment by the gesture determination mechanism. In addition, the present invention can use the size or shape of the object at different times to adjust the movement vectors at different times to achieve the same purpose. Furthermore, the present invention can use the distance of the object from the image acquisition unit as the weight for adjusting the movement vectors at different times, which likewise avoids misjudgment by the gesture determination mechanism.
[0054] The above embodiments are only preferred embodiments of the present invention and cannot be used to limit the scope of implementation of the present invention; simple equivalent changes and modifications made according to the claims and the description of the invention still fall within the scope of the patent. In addition, no embodiment or claim of the present invention needs to achieve all of the objectives, advantages, or features disclosed herein. Furthermore, the abstract and the title are only used to assist patent-document searching and are not intended to limit the scope of rights of the present invention.

