Tracking device, tracking method, and tracking system

A tracking device, an information processing method, a computer-readable storage medium, and electronic equipment are disclosed. The technology addresses the problem that a person's movement trajectory could not previously be detected with sufficient accuracy.

Pending Publication Date: 2020-11-10
TOSHIBA TEC KK

AI Technical Summary

Problems solved by technology

Conventional camera-based tracking does not take the distance from the camera to the person into account. Therefore, the movement trajectory of a person has not been detected in detail with sufficient accuracy.



Examples


First embodiment

[0044] Figure 1 is a block diagram showing the main circuit configuration of the tracking device 1 according to the first embodiment.

[0045] The tracking device 1 tracks the actions of a person 103 in the store 101 based on the results of detecting the person 103 by a smart camera 102 installed to image the store 101 of the shop 100.

[0046] The smart camera 102 captures a moving image (video). The smart camera 102 determines the area in which the person 103 appears in the captured moving image (hereinafter referred to as a recognition area). The smart camera 102 also measures the distance from the smart camera 102 to the person 103 reflected in the captured moving image. Any method, such as a stereo camera method or a ToF (Time of Flight) method, can be applied to this distance measurement. Every time the smart camera 102 determines a new recognition area, it outputs detection data including area data specifying the recognition area and distance data indicating the measured distance.
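To make the dataflow in [0046] concrete, the following is a minimal sketch, assuming hypothetical field names (none of them appear in the patent), of the detection record a smart camera 102 could output for each newly determined recognition area.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Bounding box of the recognition area in the captured frame (pixels)."""
    x: int       # left edge
    y: int       # top edge
    width: int
    height: int

@dataclass
class Detection:
    """One detection record, emitted whenever a new recognition area is determined."""
    camera_id: str     # which smart camera produced the detection (assumed field)
    timestamp: float   # capture time in seconds (assumed field)
    region: Region     # area data specifying where the person appears
    distance_m: float  # distance data: measured distance from the camera to the person

# Example record as the tracking device 1 might receive it
detection = Detection(camera_id="cam-102", timestamp=12.5,
                      region=Region(x=310, y=120, width=64, height=180),
                      distance_m=3.2)
print(detection)
```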

Second embodiment

[0100] Figure 5 is a block diagram showing the main circuit configuration of the tracking device 2 according to the second embodiment. In Figure 5, elements that are the same as those shown in Figure 1 are given the same reference symbols, and their detailed descriptions are omitted.

[0101] The tracking device 2 tracks the actions of the person 103 in the store 101 based on the detection results for the person 103 obtained by each of two smart cameras 102 installed to image the store 101 of the shop 100.

[0102] The two smart cameras 102 are installed such that their imaging center directions differ from each other and such that both can simultaneously image a single person 103.

[0103] The hardware configuration of the tracking device 2 is the same as that of the tracking device 1 . The tracking device 2 differs from the tracking device 1 in that the main memory 12 or the auxiliary storage unit 13 stores an information processing program for pe...
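Paragraphs [0101] and [0102] describe two smart cameras 102 with differing imaging center directions observing the same person 103. As one hypothetical illustration of how the tracking device 2 could combine the two resulting position estimates, the sketch below simply averages them; the patent excerpt does not state the actual fusion rule, and the function name and coordinate convention are assumptions.

```python
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) position on the store floor, in meters

def fuse_positions(estimates: List[Point]) -> Point:
    """Combine per-camera position estimates for the same person.

    Hypothetical fusion rule: simple averaging of the estimates produced
    independently from each smart camera's detection data.
    """
    xs = [p[0] for p in estimates]
    ys = [p[1] for p in estimates]
    n = len(estimates)
    return (sum(xs) / n, sum(ys) / n)

# Two cameras observing the same person 103 from different directions
print(fuse_positions([(4.1, 2.3), (3.9, 2.5)]))  # -> (4.0, 2.4)
```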



Abstract

The invention discloses a tracking device, an information processing method, a readable storage medium, and electronic equipment. The tracking device can track a target object more precisely. The tracking device includes an acquisition unit, a direction determination unit, and a position determination unit. The acquisition unit acquires, from a camera installed in a facility whose installation position and imaging center direction are known, region data specifying the region in which an object is reflected in an image captured by the camera, and distance data specifying the distance from a preset reference position to the object. The direction determination unit determines, on the basis of the region data acquired by the acquisition unit, the amount of shift between the direction from the camera to the object and the imaging center direction. The position determination unit determines the position of the object in the facility on the basis of the installation position of the camera, the imaging center direction, the amount of shift determined by the direction determination unit, and the distance data acquired by the acquisition unit.
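The abstract describes a geometric computation: the recognition area's offset from the image center gives the amount of shift from the imaging center direction, and that direction together with the measured distance places the object relative to the camera's known installation position. The following is a minimal two-dimensional sketch of that idea; the horizontal field of view, image width, and all variable names are assumptions for illustration, not values from the patent.

```python
import math

def object_position(camera_xy, center_dir_deg, region_center_px,
                    image_width_px, hfov_deg, distance_m):
    """Estimate the object's floor position from one detection.

    camera_xy        -- known installation position of the camera (x, y), meters
    center_dir_deg   -- known imaging center direction, degrees in the floor plane
    region_center_px -- horizontal pixel coordinate of the recognition area's center
    image_width_px   -- width of the captured image in pixels (assumed parameter)
    hfov_deg         -- horizontal field of view of the camera (assumed parameter)
    distance_m       -- measured distance from the camera to the object
    """
    # Amount of shift: angular offset of the object from the imaging center direction,
    # approximated here as proportional to the pixel offset from the image center.
    pixel_offset = region_center_px - image_width_px / 2.0
    shift_deg = pixel_offset * (hfov_deg / image_width_px)

    # Direction from the camera to the object, then project the measured distance.
    direction_deg = center_dir_deg + shift_deg
    x = camera_xy[0] + distance_m * math.cos(math.radians(direction_deg))
    y = camera_xy[1] + distance_m * math.sin(math.radians(direction_deg))
    return (x, y)

# Example: camera at (0, 0) looking along +x (0 degrees), 60-degree HFOV, 640-px image
print(object_position((0.0, 0.0), 0.0, 400, 640, 60.0, 3.2))
```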

Description

[0001] This application claims priority from Japanese application JP2019-089091, filed on May 9, 2019, the content of which is incorporated herein by reference in its entirety.

Technical Field

[0002] Embodiments of the present invention relate to a tracking device, an information processing method, a computer-readable storage medium, and electronic equipment.

Background Art

[0003] Conventionally, a technique is known for tracking a person based on changes in the area where the person is reflected in an image captured by a camera.

[0004] However, the image captured by the camera contains almost no information on the distance from the camera to the person, and such tracking does not take that distance into account. Therefore, the movement trajectory of a person has not been detected in detail with sufficient accuracy.

[0005] Under such circumstances, there is a demand for a technology...


Application Information

IPC(8): G06T7/20, G06T7/70
CPC: G06T7/20, G06T7/70, G06T2207/30196, G06T2207/30232, G06V20/52, G06T7/50
Inventor: 市川隆
Owner: TOSHIBA TEC KK