
Target identification method and device

A target recognition method and device, applied in the field of human-computer interaction, which solves the problem of a high probability of misrecognition in existing methods and achieves the effects of improved accuracy and a reduced probability of misrecognition.

Active Publication Date: 2012-06-20
TCL CORPORATION

AI Technical Summary

Problems solved by technology

[0004] The embodiments of the present invention provide a target recognition method that aims to solve the problem that existing target recognition methods have a high probability of misrecognition when recognizing a target action.



Examples


Embodiment 1

[0031] Figure 1 shows a flow chart of the target recognition method provided in the first embodiment of the present invention. The target recognition method provided in this embodiment includes:

[0032] Step S11: synchronously acquire a sequence of depth frames and a sequence of color frames, where the sequence of depth frames is a plurality of depth images collected by a depth camera.

[0033] In this embodiment, the depth camera collects depth maps at more than 10 frames per second, typically 25 frames per second. Each depth map is numbered sequentially, and these depth maps with incrementing numbers form the sequence of depth frames.

[0034] In this embodiment, an ordinary camera or the depth camera is used to collect color images. The color images are collected at the same frequency as the depth images and of the same scene, and each color image is likewise numbered sequentially. A sequence of color fr...
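The synchronized acquisition described in paragraphs [0032]-[0034] can be sketched as follows. This is a minimal illustration, not the patent's implementation: `capture_depth` and `capture_color` are hypothetical stand-ins for real camera reads, and only the shared incremental numbering from the description is modeled.

```python
import numpy as np

def capture_depth(shape=(48, 64)):
    # Hypothetical stand-in for a depth-camera read (a real driver call goes here).
    return np.random.randint(0, 4096, size=shape, dtype=np.uint16)

def capture_color(shape=(48, 64, 3)):
    # Hypothetical stand-in for a color-camera read of the same scene.
    return np.random.randint(0, 256, size=shape, dtype=np.uint8)

def acquire_sequences(n_frames):
    """Synchronously build sequentially numbered depth and color frame sequences."""
    depth_seq, color_seq = [], []
    for frame_no in range(n_frames):
        depth_seq.append((frame_no, capture_depth()))  # incrementally numbered depth maps
        color_seq.append((frame_no, capture_color()))  # same number, same scene
    return depth_seq, color_seq

depth_seq, color_seq = acquire_sequences(n_frames=5)
```

In a real system the loop would be paced by the camera's 25 fps clock; here the shared frame number alone carries the synchronization between the two sequences.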

Embodiment 2

[0052] Figure 2 shows the flow of the target recognition method provided by the second embodiment of the present invention. This embodiment mainly describes step S12 of the first embodiment in more detail. The target recognition method provided by the second embodiment mainly includes:

[0053] Step S21: synchronously acquire a sequence of depth frames and a sequence of color frames, where the sequence of depth frames is a plurality of depth images collected by a depth camera.

[0054] In this embodiment, the execution of step S21 is the same as that of step S11 in the first embodiment above, and is not repeated here.

[0055] Step S22: respectively acquire the depth value groups within the preset region of interest (ROI) of two adjacent depth frames, and calculate the difference between the depth value groups of the preceding and succeeding frames.

[0056] In this embodiment, the depth value group in th...
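Step S22 can be sketched as below. The rectangular ROI layout `(y0, y1, x0, x1)` and the NumPy types are illustrative assumptions, not details taken from the patent:

```python
import numpy as np

def roi_depth_group(depth_map, roi):
    """Extract the depth value group inside a rectangular ROI (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = roi
    return depth_map[y0:y1, x0:x1]

def roi_depth_difference(prev_depth, next_depth, roi):
    """Per-pixel difference between the ROI depth groups of two adjacent frames.

    Cast to a signed type first: raw depth maps are typically unsigned, and
    subtracting them directly would wrap around instead of going negative.
    """
    return (roi_depth_group(next_depth, roi).astype(np.int32)
            - roi_depth_group(prev_depth, roi).astype(np.int32))
```

With this sign convention, an object moving toward the camera makes the difference negative inside the ROI, which is the kind of change the later activation test looks for.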

Embodiment 3

[0072] Figure 3 shows the process flow of the target recognition method provided by the third embodiment of the present invention. In this embodiment, the selected limb target is a human hand. This embodiment mainly describes steps S15 and S16 of the first embodiment and steps S26 and S27 of the second embodiment in more detail:

[0073] Step S31: synchronously acquire a sequence of depth frames and a sequence of color frames, where the sequence of depth frames is a plurality of depth images collected by a depth camera.

[0074] Step S32: according to the changes in depth values between adjacent depth maps in the depth frame sequence, judge whether an activation action exists in the preset region of interest of the later depth map; if not, go to step S33; if so, go to step S34.

[0075] Step S33: continue monitoring.

[0076] In this embodiment, the execution process o...



Abstract

The invention is applicable to the electronic field and provides a target identification method and device. The method comprises the following steps: judging, according to the variation in depth values between adjacent front and back depth maps in a depth frame sequence, whether a starting action exists in the region of interest (ROI) of the back depth map; detecting matching regions in the corresponding colour frame according to a preset limb target model, and judging the regions which accord with the limb target model to be limb target regions; storing the characteristic set parameters of the limb target regions; tracking, in the depth frames, the regions of the previous colour frame which were judged to be limb target regions, and detecting matching regions in the current colour frame by using the preset limb target model and the stored characteristic set parameters of the previous limb target regions, so as to obtain the limb target regions; and acquiring the coordinates of each limb target region and identifying the target action according to the acquired coordinate values. In the target identification method and device provided by the embodiments of the invention, a depth image sequence and a colour image sequence are used together to detect the limb target regions, so that detection accuracy is effectively improved.
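The final step of the abstract (acquiring the coordinates of each limb target region and identifying the action from them) can be illustrated with a toy example: the region coordinate is taken as the centroid of a detected mask, and a hypothetical swipe classifier looks at the net horizontal shift. Both the centroid choice and the swipe rule are assumptions for illustration, not the patent's classifier.

```python
import numpy as np

def region_centroid(mask):
    """Coordinates of a limb target region: centroid (y, x) of a boolean mask."""
    ys, xs = np.nonzero(mask)
    return float(ys.mean()), float(xs.mean())

def classify_action(centroids, min_shift=10.0):
    """Toy action recognition from per-frame region coordinates:
    a large net horizontal centroid shift is read as a left/right swipe."""
    (_, x_first), (_, x_last) = centroids[0], centroids[-1]
    dx = x_last - x_first
    if dx > min_shift:
        return "swipe_right"
    if dx < -min_shift:
        return "swipe_left"
    return "none"
```

Per-frame masks would come from the limb-target detection described above; this block only covers turning the resulting coordinate sequence into an action label.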

Description

Technical field

[0001] The invention belongs to the technical field of human-computer interaction, and in particular relates to a target recognition method and device.

Background technique

[0002] In existing human-computer interaction devices (such as smart TVs), a user can perform simple human-computer interaction with the device. Such devices often involve target action recognition technology: the device responds to the user's control by recognizing a target action (such as a gesture).

[0003] Existing target recognition methods often identify human body movements through two-dimensional image processing. However, because two-dimensional images can hardly reflect actual objects fully and accurately, it is difficult for such methods to accurately distinguish targets and recognize human body movements, and the probability of misrecognition is high.

Contents of the invention

[0004] An embodiment of...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/32
Inventor: 李相涛
Owner: TCL CORPORATION