
Human-computer interaction device and method used for target tracking

A human-computer interaction and target tracking technology, applied to color TV parts, TV system parts, image data processing, etc. It can solve problems such as low precision, bulky size and a high rate of losing the tracked target, achieving high precision and a low tracking loss rate.

Active Publication Date: 2013-04-03
SHENZHEN INST OF ADVANCED TECH

AI Technical Summary

Problems solved by technology

[0005] The above-mentioned patents and the existing target tracking systems are generally fixed devices, which are relatively bulky, have poor mobility, and have a limited range of tracking ...



Examples


Embodiment 1

[0047] Referring to Figure 1 and Figure 2: Figure 1 is a schematic front view of the human-computer interaction device for target tracking in Embodiment 1 of the present invention, and Figure 2 is a schematic top view of the same device. This embodiment provides a human-computer interaction device for target tracking, which includes a helmet device 1 on which a first information processing unit 2, a display device 3, a gaze tracking system 4 and a plurality of cameras 5 are arranged. The cameras 5 are installed around the helmet device 1 and connected to the first information processing unit 2; they shoot video of the scene around the helmet device 1 in real time and transmit the captured scene video to the first information processing unit 2. There are at least four cameras 5, installed respectively on the front, back, left and right sides of the helmet device 1, in t...
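As a concrete illustration of this camera arrangement, below is a minimal sketch, not taken from the patent, of how four helmet-mounted cameras could be opened and polled for frames before the frames are handed to a processing unit. The use of Python with OpenCV, the device indices and the error handling are all assumptions for illustration.

    # Minimal sketch (assumed, not the patent's implementation): grab frames
    # from four cameras mounted on the front, back, left and right of the
    # helmet, as described in Embodiment 1. Camera indices are illustrative.
    import cv2

    CAMERA_INDICES = {"front": 0, "back": 1, "left": 2, "right": 3}  # assumed device IDs

    def open_cameras(indices):
        """Open one VideoCapture per helmet-mounted camera."""
        caps = {}
        for name, idx in indices.items():
            cap = cv2.VideoCapture(idx)
            if not cap.isOpened():
                raise RuntimeError(f"camera '{name}' (index {idx}) could not be opened")
            caps[name] = cap
        return caps

    def grab_frames(caps):
        """Grab one frame from every camera; return {name: image} or None on failure."""
        frames = {}
        for name, cap in caps.items():
            ok, frame = cap.read()
            if not ok:
                return None
            frames[name] = frame
        return frames

    if __name__ == "__main__":
        cams = open_cameras(CAMERA_INDICES)
        frames = grab_frames(cams)   # in the device, these frames would go to
        if frames:                   # the first information processing unit 2
            print({name: img.shape for name, img in frames.items()})
        for cap in cams.values():
            cap.release()

In practice the frame grabs would need to be synchronized, or at least time-stamped, so that the later stitching step combines frames taken at the same moment.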

Embodiment 2

[0053] This embodiment differs from Embodiment 1 in that the gaze tracking system 4 does not have its own second information processing unit 43; the function of the second information processing unit 43 in Embodiment 1 is instead implemented by the first information processing unit 2, that is, the first information processing unit 2 and the second information processing unit 43 are integrated together, which reduces the cost of the device. Parts that are the same as or similar to Embodiment 1 are not repeated here.

[0054] Referring to Figure 4, which is a structural block diagram of the human-computer interaction device in Embodiment 2 of the present invention. The gaze tracking system 4 includes an infrared light source 41 and a camera 42. The infrared light source 41 turns on and off alternately. The camera 42 is connected to the first information processing unit 2 and captures video frames of the altern...
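The paragraph above is truncated, but an infrared source that switches on and off alternately, paired with a camera synchronized to it, is commonly exploited for eye tracking by differencing consecutive frames so that only features responding strongly to the infrared illumination (pupil and corneal reflection) stand out. The sketch below shows that general differencing idea only; the threshold value and contour-based localization are assumptions, and this is not the patent's claimed algorithm.

    # Hedged sketch: estimate a pupil position by differencing a frame taken
    # with the infrared source on against one taken with it off. Threshold
    # and contour-based localization are illustrative assumptions.
    import cv2

    def estimate_pupil_center(frame_ir_on, frame_ir_off, diff_threshold=40):
        """Return (x, y) of the largest differenced blob, or None if not found."""
        on_gray = cv2.cvtColor(frame_ir_on, cv2.COLOR_BGR2GRAY)
        off_gray = cv2.cvtColor(frame_ir_off, cv2.COLOR_BGR2GRAY)

        # Regions lit mainly by the infrared source change a lot between the
        # two frames; the rest of the image changes little.
        diff = cv2.absdiff(on_gray, off_gray)
        _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)

        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        (x, y), _radius = cv2.minEnclosingCircle(largest)
        return int(x), int(y)

Mapping the detected pupil position to a point of regard in the displayed panoramic video would additionally require a calibration step, which the truncated text does not describe.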

Embodiment 3

[0057] Referring to Figure 3, which is a flow chart of the human-computer interaction method for target tracking in Embodiment 3 of the present invention.

[0058] This method can be implemented on the human-computer interaction device for target tracking in Embodiment 1 or 2. The method comprises the steps of:

[0059] Step S1: Capture video of the scene around the helmet device in real time.

[0060] In this step, the video of the scene around the helmet device is captured in real time through multiple cameras, for example, 4 cameras.

[0061] Step S2: Process the scene video.

[0062] In this step, scene fusion and stitching are performed on the video image sequences captured by the cameras to obtain a 360-degree panoramic video of the scene.

[0063] Step S3: Display the processed scene video on the display screen.

[0064] In this step, the processed 360-degree panoramic video is displayed on the glasses-type micro-display.

[0065] Step S4: By tracking the user's ...
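Step S4 is truncated above, but steps S1 to S4 together describe a capture, stitch, display and gaze-select pipeline (the abstract states that the gaze tracking system determines, locks and tracks the target the user chooses by following the user's eye gaze). The sketch below illustrates, under stated assumptions, how such a pipeline could look in code: OpenCV's generic stitcher stands in for the 360-degree scene fusion of step S2, and a gaze point plus simple template matching stand in for the gaze-driven target selection and tracking of step S4. None of this is the patent's actual implementation.

    # Hedged sketch of the S1-S4 pipeline. `frames` is a list of per-camera
    # images from step S1; the gaze point would come from the gaze tracking
    # system of Embodiment 1 or 2 and is assumed here.
    import cv2

    def stitch_panorama(frames):
        """Step S2 (stand-in): fuse/stitch per-camera frames into one panorama."""
        stitcher = cv2.Stitcher_create()
        status, pano = stitcher.stitch(frames)
        if status != 0:  # 0 corresponds to a successful stitch
            raise RuntimeError(f"stitching failed with status {status}")
        return pano

    def select_target_from_gaze(pano, gaze_xy, half_size=40):
        """Step S4 (assumed form): crop a template around the user's gaze point."""
        x, y = gaze_xy
        h, w = pano.shape[:2]
        x0, y0 = max(0, x - half_size), max(0, y - half_size)
        x1, y1 = min(w, x + half_size), min(h, y + half_size)
        return pano[y0:y1, x0:x1].copy()

    def track_target(pano, template):
        """Toy tracking step: locate the selected template in a new panorama."""
        result = cv2.matchTemplate(pano, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        return top_left, score

In the actual device the processed panorama would be shown on the glasses-type micro-display (step S3), and a dedicated tracking algorithm would replace the template matching shown here.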



Abstract

The invention relates to the technical field of target tracking and provides a human-computer interaction device used for target tracking. The device comprises helmet equipment on which a first information processing unit, a display device, a gaze tracking system and a plurality of cameras are arranged. The cameras shoot the scene around the helmet equipment in real time and transmit the captured scene video to the first information processing unit; the first information processing unit receives and processes the scene video and sends the processed scene video to the display device; the display device displays the processed scene video; and the gaze tracking system, by tracking the user's eye gaze, determines the target in the scene that the user chooses to track, locks onto the target and tracks it in real time. The invention also provides a human-computer interaction method for target tracking. The device and the method allow the target to be tracked to be selected freely in real time and can detect and track the target object through 360 degrees without blind spots, with a low tracking loss rate and high precision.

Description

Technical Field

[0001] The invention relates to the technical field of target tracking, and in particular to a human-computer interaction device and method for target tracking.

Background Art

[0002] Most existing target tracking devices are fixed camera devices, which detect and track target objects in real time from a single angle through the rotation of the fixed camera. For example, images of the scene are acquired through a fixed camera or a network camera, the collected image and video data are stored by a processor, and specific algorithms are used to process and analyze the acquired images in real time so as to quickly detect and track target objects. This kind of target tracking relies on the selection of the target object, and the selection conditions need to be set in advance through software. However, in many cases it may not be possible to detect and track the target object in real time because of the viewing angle, or the wrong target ...


Application Information

IPC(8): G06T 7/20; H04N 5/232; H04N 7/18
Inventors: 程俊, 陶大程, 陈裕华, 姜军, 张子锐
Owner: SHENZHEN INST OF ADVANCED TECH