Target position determination method and device based on multiple cameras and computer equipment

A target position determination method based on multiple cameras, applied in the field of tracking and positioning. It addresses the problems of low target position accuracy, the small field of view of a single camera, and the unpredictable movement of target objects, improving both the accuracy of target position determination and real-time performance.

Pending Publication Date: 2020-09-01
Applicant: GUANGZHOU HAIGE COMM GRP INC +1

AI Technical Summary

Problems solved by technology

[0003] However, the current method for determining the target position generally uses a visual tracking algorithm to detect a specified target object in the video captured by a single camera and obtain the target object's position in the video. However, the field of view of a single camera is relatively small, and in a complex scene the movement of the target object is non-directional and unpredictable; that is, the target object easily moves beyond the camera's field of view, resulting in low accuracy of the determined target position.




Embodiment Construction

[0040] In order to make the purpose, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present application and are not intended to limit it.

[0041] The multi-camera-based target position determination method provided by this application can be applied to the application environment shown in Figure 1a, in which the cameras 110 (such as camera 110a, camera 110b, ..., camera 110n) communicate with the server 120 through a network. As shown in Figure 1b, multiple cameras are installed on the principle that the field-of-view angles of adjacent cameras include a certain overlapping area, so that together they cover the entire field of view in which the target object to be tracked may move. Specifically, referring to Figure 1a, the server 120 determines the target object position i...
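To make the deployment in Figures 1a and 1b concrete, the following is a minimal sketch, in Python, of one way the camera layout could be represented on the server side. All names here (Camera, OverlapLink, stream_url, the RTSP endpoints, and the example homography values) are illustrative assumptions and are not taken from the patent.

```python
# Minimal sketch of the multi-camera deployment (assumed representation).
# Cameras stream frames to the server over the network; adjacent cameras are
# registered in pairs whose fields of view share an overlapping area, so a
# target leaving one view can be handed off to the next camera.
from dataclasses import dataclass

@dataclass
class Camera:
    cam_id: str        # e.g. "110a", "110b", ..., "110n" (labels from Figure 1a)
    stream_url: str    # network endpoint the server reads frames from (hypothetical)

@dataclass
class OverlapLink:
    src: str                       # camera whose image edge the target is approaching
    dst: str                       # adjacent camera whose view overlaps that edge
    homography: list[list[float]]  # maps src image coordinates into dst image
                                   # coordinates, calibrated once from the shared
                                   # overlap region (assumed calibration scheme)

# Example chain covering the whole area the tracked object may move through.
cameras = [
    Camera("110a", "rtsp://camera-110a/stream"),  # hypothetical URLs
    Camera("110b", "rtsp://camera-110b/stream"),
]
links = [
    OverlapLink("110a", "110b", [[1.0, 0.0, -600.0],
                                 [0.0, 1.0,    0.0],
                                 [0.0, 0.0,    1.0]]),  # example values only
]
```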



Abstract

The invention relates to a target position determination method and device based on multiple cameras, computer equipment and a storage medium. The method comprises the steps of determining a target object position in a current video frame shot by a current camera through a pre-trained target tracking prediction classifier; if the distance between the target object position and the image edge of the current video frame is smaller than the preset distance, determining the projection position of the target object in the video frame shot by the adjacent camera according to the target object position, wherein the adjacent camera is the next camera adjacent to the current camera; determining a target search area image in the video frame according to the projection position; and inputting the target search area image into a target tracking prediction classifier to obtain a target position of the target object in the video frame. By adopting the method, the position of the target object is tracked and positioned through the plurality of cameras, so that the determination accuracy of the target position is improved.
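Read as an algorithm, the abstract describes four steps: track the target in the current camera's frame, check whether the position has come within a preset distance of the image edge, project that position into the adjacent camera's frame, and re-run the tracking prediction classifier on a search-area image cut around the projection. The Python sketch below illustrates that loop under stated assumptions: classifier, homographies, EDGE_MARGIN, SEARCH_SIZE, and the circular camera ordering are placeholders rather than identifiers from the patent, and using a per-pair homography for the projection is one plausible realization of the "projection position", not necessarily the patent's own method.

```python
# A minimal sketch of the camera hand-off loop described in the abstract.
# All names and values below are illustrative assumptions, not identifiers
# taken from the patent.
import numpy as np

EDGE_MARGIN = 40    # preset distance to the image edge, in pixels (assumed value)
SEARCH_SIZE = 128   # side length of the square target search area (assumed value)

def distance_to_edge(pos, shape):
    """Smallest pixel distance from the target position to any image border."""
    x, y = pos
    h, w = shape[:2]
    return min(x, y, w - 1 - x, h - 1 - y)

def project(pos, H):
    """Map a point into the adjacent camera's image plane with a homography H
    calibrated offline from the overlapping field of view (assumed scheme)."""
    p = H @ np.array([pos[0], pos[1], 1.0])
    return p[0] / p[2], p[1] / p[2]

def crop(frame, center, size=SEARCH_SIZE):
    """Square search-area image around the projected position, plus its origin."""
    h, w = frame.shape[:2]
    x0 = int(max(0, min(center[0] - size // 2, w - size)))
    y0 = int(max(0, min(center[1] - size // 2, h - size)))
    return frame[y0:y0 + size, x0:x0 + size], (x0, y0)

def track(cameras, classifier, homographies, cam_id, pos):
    """cameras[i] yields frames; classifier(image) returns an (x, y) position."""
    while True:
        frame = next(cameras[cam_id])
        pos = classifier(frame)                       # step 1: track in current frame
        if distance_to_edge(pos, frame.shape) < EDGE_MARGIN:
            nxt = (cam_id + 1) % len(cameras)         # adjacent camera (assumed ordering)
            proj = project(pos, homographies[(cam_id, nxt)])   # step 2: projection position
            nxt_frame = next(cameras[nxt])
            region, (x0, y0) = crop(nxt_frame, proj)           # step 3: search-area image
            rx, ry = classifier(region)                        # step 4: re-locate target
            pos, cam_id = (x0 + rx, y0 + ry), nxt
        yield cam_id, pos
```

Offsetting the classifier's output by the crop origin keeps every reported position in full-frame coordinates of the camera that currently sees the target, so the hand-off stays transparent to whatever consumes the positions.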

Description

Technical field

[0001] The present application relates to the technical field of tracking and positioning, and in particular to a multi-camera-based target position determination method, device, computer equipment, and storage medium.

Background technique

[0002] With the rapid development of computer network technology, digital equipment, and digital storage equipment, it has become possible to track and locate target objects (such as people, animals, etc.) through cameras installed in fixed places, on unmanned vehicles, on drones, and the like, and thereby determine the target position of the target object.

[0003] However, the current method for determining the target position generally uses a visual tracking algorithm to detect a specified target object in the video captured by a single camera and obtain the target object's position in the video. However, the field of view of a single camera is relatively small, and in a complex scene the movement of the target object is not direct...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/292, G06T7/73, G06K9/62
CPC: G06T7/292, G06T7/73, G06T2207/10016, G06F18/24
Inventor: 车满强, 李树斌
Owner: GUANGZHOU HAIGE COMM GRP INC