The present invention is a method and system for automatically analyzing the behavior of a person, or of a plurality of persons, in a physical space based on measurement of the trip that the person or persons make through the space, as captured in input images. The present invention captures a plurality of input images of the person by a plurality of means for capturing images, such as cameras. The plurality of input images is processed in order to track the person in each field of view of the plurality of means for capturing images. The present invention measures trip information for the person in the physical space based on the processed results from the plurality of tracks, and analyzes the behavior of the person based on the trip information. The trip information can comprise coordinates of the person's position for the plurality of tracks, along with temporal attributes, such as trip time, and spatial attributes, such as trip length.
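The trip attributes named above can be illustrated with a minimal sketch. The track format here, a chronologically ordered list of timestamped floor-plane coordinates, is an assumption for illustration and is not a data structure taken from the specification.

```python
import math

def trip_metrics(track):
    """Compute trip time and trip length from one track.

    `track` is assumed to be a chronologically ordered list of
    (timestamp_seconds, x, y) tuples in floor-plane coordinates;
    this format is illustrative, not from the specification.
    """
    if len(track) < 2:
        return 0.0, 0.0
    trip_time = track[-1][0] - track[0][0]          # temporal attribute
    trip_length = sum(                              # spatial attribute
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(track, track[1:])
    )
    return trip_time, trip_length

# Example: a short track through the physical space
track = [(0.0, 0.0, 0.0), (5.0, 3.0, 4.0), (10.0, 3.0, 8.0)]
print(trip_metrics(track))  # (10.0, 9.0)
```

In practice these per-track measurements would be aggregated across the plurality of tracks belonging to one trip, but the per-segment summation shown here is the core of the measurement.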
The physical space may be a retail space, and the person may be a customer in the retail space. The trip information can provide key measurements, along the entire shopping trip from entrance to checkout, that serve as a foundation for the behavior analysis of the customer and deliver deeper insights into the customer's behavior. The focus of the present invention is on automatic behavior analytics applications based upon the trip information extracted from the video, where the exemplary behavior analysis comprises map generation as a visualization of the behavior, quantitative category measurement, dominant path measurement, category correlation measurement, and category sequence measurement.
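Of the exemplary analyses listed, map generation can be sketched as accumulating tracked positions into a grid of visit counts, which can then be rendered as a heat map over the floor plan. The grid layout, cell size, and track format below are illustrative assumptions, not details from the specification.

```python
def generate_visit_map(tracks, width, height, cell_size):
    """Accumulate tracked positions into a grid of visit counts.

    `tracks` is assumed to be an iterable of tracks, each a list of
    (timestamp, x, y) tuples in floor-plane coordinates covering a
    space of the given width and height; these conventions are
    illustrative, not taken from the specification.
    """
    cols = int(width // cell_size)
    rows = int(height // cell_size)
    grid = [[0] * cols for _ in range(rows)]
    for track in tracks:
        for _, x, y in track:
            col = min(int(x // cell_size), cols - 1)  # clamp edge points
            row = min(int(y // cell_size), rows - 1)
            grid[row][col] += 1
    return grid

# Two short tracks through a 4 m x 4 m space, binned into 2 m cells
tracks = [
    [(0.0, 1.0, 1.0), (1.0, 3.0, 1.0), (2.0, 3.0, 3.0)],
    [(0.0, 1.0, 1.0), (1.0, 1.0, 3.0)],
]
heat = generate_visit_map(tracks, width=4.0, height=4.0, cell_size=2.0)
print(heat)  # [[2, 1], [1, 1]]
```

The same accumulation pattern extends naturally to the other listed measurements, for example counting cell-to-cell transitions for dominant path measurement instead of raw visits.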