Automated Proximity Discovery of Networked Cameras

A networked camera technology in the field of automated proximity discovery of networked cameras. It addresses the problems that there is no mechanism to let computing easily interact with content captured by different cameras, that limited machine assistance is available to interpret or detect relevant data in images, and that even fewer options exist for images captured by cameras in proximity to each other, with the effect of enhancing the capture of information.

Pending Publication Date: 2022-03-31
SCENERA INC
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

Enables flexible and efficient processing of data from multiple cameras. By automating the determination of camera positions and environments, it improves object tracking and enables applications such as home security and care for frail individuals, reducing the need for manual updates and human monitoring.

Problems solved by technology

There generally is no mechanism to enable computing to easily interact in a meaningful way with content captured by different cameras within a network.
There is limited machine assistance available to interpret or detect relevant data in images and even fewer options to do so for images captured by different cameras in proximity to each other.
This results in most data from cameras not being processed in real time and, at best, captured images are used for forensic purposes after an event has been known to have occurred.
Another problem today is that the processing of information is highly application specific.
As a result, the development of applications that make use of networks of sensors is both slow and limited.
For example, surveillance cameras installed in an environment typically are used only for security purposes and in a limited way.
However, as cameras are added, removed or repositioned, the software may have to be manually updated to account for these changes in the physical installation.



Embodiment Construction

[0027]The figures and the following description relate to preferred embodiments by way of illustration only. It should be noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.

[0028]FIGS. 1A-1D illustrate an example of automated discovery of camera proximity. In this example, a network of cameras 1-4 view a physical environment that includes rooms 1-3. Camera 1 views room 1, camera 2 views room 2, camera 3 views a doorway between rooms 2 and 3, and camera 4 views room 3. However, the relative positioning of the rooms and the cameras is not known. A synchronization service 110 receives TimeLines for the cameras. A TimeLine is a sequence of time-stamped data relating to the camera's view, typically including images acquired by the camera. By comparing these TimeLines, the synchronization service 110 ...
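The TimeLine comparison described above might be sketched as follows. This is a hypothetical illustration, not the patent's actual method: the enter/exit event stream per camera, the `infer_proximity` heuristic, and the fixed time window are all assumptions introduced here.

```python
from dataclasses import dataclass


@dataclass
class TimeLine:
    """Time-stamped data from one camera's view (a simplified sketch;
    the patent's TimeLine typically also carries the captured images)."""
    camera_id: int
    events: list  # list of (timestamp_seconds, event) pairs, e.g. "enter"/"exit"


def infer_proximity(timelines, max_gap=5.0):
    """Hypothetical heuristic: if a tracked object exits one camera's view
    and appears in another camera's view within `max_gap` seconds, assume
    the two cameras view adjoining spaces and record an edge between them."""
    edges = set()
    for a in timelines:
        for b in timelines:
            if a.camera_id == b.camera_id:
                continue
            exits = [t for t, e in a.events if e == "exit"]
            enters = [t for t, e in b.events if e == "enter"]
            for tx in exits:
                if any(0 <= te - tx <= max_gap for te in enters):
                    edges.add(tuple(sorted((a.camera_id, b.camera_id))))
    return edges
```

For the rooms example, an object leaving camera 1's view shortly before entering camera 2's view would yield an edge between cameras 1 and 2, while cameras whose sightings are never close in time remain unconnected.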



Abstract

Automated discovery of the relative positioning of a network of cameras that view a physical environment. The automated discovery is based on comparing TimeLines for the cameras. The TimeLines are time-stamped data relating to the camera's view, for example a sequence of time stamps and corresponding images captured by a camera at those time stamps. In one approach, the relative positioning is represented by a proximity graph of nodes connected by edges. The nodes represent spaces in the physical environment, and each edge between two nodes represents a pathway between the spaces represented by the two nodes.
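The proximity graph described in the abstract could be represented minimally as below. The class and method names are illustrative assumptions, not the patent's actual data structure.

```python
class ProximityGraph:
    """Sketch of the abstract's proximity graph: nodes are spaces in the
    physical environment; each edge is a pathway between two spaces."""

    def __init__(self):
        self.adjacency = {}  # space name -> set of adjoining space names

    def add_space(self, space):
        # Ensure a node exists even if it has no pathways yet.
        self.adjacency.setdefault(space, set())

    def add_pathway(self, space_a, space_b):
        # Pathways are undirected: record the edge in both directions.
        self.add_space(space_a)
        self.add_space(space_b)
        self.adjacency[space_a].add(space_b)
        self.adjacency[space_b].add(space_a)

    def neighbors(self, space):
        return sorted(self.adjacency.get(space, ()))
```

In the FIGS. 1A-1D example, rooms 1-3 would be nodes, and the doorway between rooms 2 and 3 would be recorded as a pathway edge.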

Description

CROSS REFERENCE TO RELATED APPLICATIONS[0001]This application is a continuation of U.S. application Ser. No. 15/900,489, “Automated Proximity Discovery of Networked Cameras,” filed Feb. 20, 2018, which is incorporated by reference in its entirety.BACKGROUND1. Technical Field[0002]This disclosure relates generally to obtaining information about networks of sensor devices, including for example cameras.2. Description of Related Art[0003]Millions of cameras and other sensor devices are deployed today. There generally is no mechanism to enable computing to easily interact in a meaningful way with content captured by different cameras within a network. Human monitoring is often required to make sense of captured videos. There is limited machine assistance available to interpret or detect relevant data in images and even fewer options to do so for images captured by different cameras in proximity to each other. This results in most data from cameras not being processed in real time and, a...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06K9/00; G06T11/20; G06T7/292
CPC: G06K9/00758; G06T7/292; G06T11/206; H04N7/181; G06V20/48
Inventors: LEE, DAVID D.; WAJS, ANDREW AUGUSTINE; RYU, SEUNGOH; LIM, CHIEN
Owner SCENERA INC