
A Machine Vision Behavior Intention Prediction Method Applied to Intelligent Buildings

A machine vision and intelligent building technology, applied in the fields of instruments, computer components, computing, etc. It solves the problem that existing prediction methods cannot accurately identify and predict user behavior in real time, achieving high accuracy, improving the degree of building intelligence, and making daily activities more convenient.

Active Publication Date: 2022-05-06

AI Technical Summary

Problems solved by technology

[0004] In view of the above problems, the purpose of the present invention is to propose a machine vision behavior intention prediction method applied to intelligent buildings, in order to solve the problem that existing behavior prediction methods cannot accurately identify and predict user behavior in real time.



Examples


Embodiment 1

[0039] Referring to Figure 1 and Figure 2, the present embodiment provides a machine vision behavior intention prediction method applied to intelligent buildings, comprising the following steps:

[0040] S1. First, build a pedestrian detection model: use computer vision techniques to determine whether pedestrians appear in the video image sequence and to locate them precisely, and collect pedestrian images. A residual network then extracts features from the pedestrian images, a multi-scale detection module detects pedestrians at different scales, and the network's output layer regresses against prior (anchor) boxes to produce the bounding box, confidence, and class probability of each pedestrian detection, yielding the pedestrian detection result;
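As a concrete illustration of step S1, the sketch below uses torchvision's off-the-shelf Faster R-CNN with a ResNet-50 (residual) backbone as a stand-in detector and keeps only the "person" class. It mirrors the outputs described above (boxes, confidences, class) but does not reproduce the patent's own multi-scale, prior-box-regression network; the model choice, threshold, and function name are illustrative assumptions.

```python
# Illustrative stand-in for step S1 (not the patent's exact network):
# a residual-backbone detector returning pedestrian boxes and confidences.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

PERSON_LABEL = 1  # "person" category id used by torchvision COCO-trained detectors

def detect_pedestrians(image_path: str, score_threshold: float = 0.5):
    # ResNet-50 residual backbone with an FPN detection head, pre-trained on COCO
    # (assumes torchvision >= 0.13 for the `weights` argument).
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]  # dict with "boxes", "labels", "scores"
    keep = (output["labels"] == PERSON_LABEL) & (output["scores"] >= score_threshold)
    return output["boxes"][keep], output["scores"][keep]
```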

[0041] The pedestrian detection model is trained on the COCO dataset; the pedestrian subset of this multi-class object dataset is extracted by a script to obtain the pre-trai...
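A hedged sketch of the "extract the pedestrian subset from COCO by script" step, using the pycocotools API; the annotation file path and the output record format are assumptions for illustration.

```python
# Sketch of extracting the person (pedestrian) subset of the COCO annotations
# for pre-training; paths and output format are illustrative.
from pycocotools.coco import COCO

def extract_person_subset(ann_file: str = "annotations/instances_train2017.json"):
    coco = COCO(ann_file)
    person_ids = coco.getCatIds(catNms=["person"])
    records = []
    for img_id in coco.getImgIds(catIds=person_ids):
        info = coco.loadImgs(img_id)[0]
        ann_ids = coco.getAnnIds(imgIds=img_id, catIds=person_ids, iscrowd=False)
        boxes = [ann["bbox"] for ann in coco.loadAnns(ann_ids)]  # [x, y, w, h]
        records.append({"file_name": info["file_name"], "boxes": boxes})
    return records
```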

Embodiment 2

[0078] The method of the present invention was tested and verified. Video data of behaviors such as entering and leaving a conference room and switching the lights on was collected, where entering through the door (a1) is the key action, touching tables and chairs (a2, a3) are irrelevant actions, and the light changing from dark to bright after being switched on is the environmental state change (s1);

[0079] From learning on one video (containing 6 occurrences of a1 and s1 and several occurrences of a2 and a3), the action set preceding each environmental state change s1 is clustered, and the key action is identified as a1. The prediction method is then applied to another video, in which a total of 6 people entered the meeting room, and a prediction was issued each time the key action a1 occurred while the light was dark; this shows that the method of the present invention meets the accuracy requirement for behavior prediction.
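The following minimal sketch illustrates the learning and prediction logic described in this embodiment: the actions observed shortly before each state change s1 are grouped, the action that recurs across (nearly) all of those windows is treated as the key action, and a control signal is emitted when that action is seen while the light is dark. The presence counting used here is a simplification of the patent's clustering step, and all names and thresholds are illustrative.

```python
# Simplified illustration of key-action learning and prediction; presence
# counting stands in for the patent's clustering of pre-change action sets.
from collections import Counter

def learn_key_action(pre_change_windows, min_support=0.8):
    """pre_change_windows: one list of action labels observed before each s1."""
    counts = Counter()
    for window in pre_change_windows:
        counts.update(set(window))              # count each action once per window
    action, hits = counts.most_common(1)[0]
    return action if hits >= min_support * len(pre_change_windows) else None

def predict_signals(action_stream, key_action, light_is_dark):
    """Yield a control signal whenever the key action occurs while the light is dark."""
    for action in action_stream:
        if action == key_action and light_is_dark:
            yield "turn_on_light"

# Setting from this embodiment: a1 = entering the door, a2/a3 = touching furniture.
windows = [["a2", "a1"], ["a1"], ["a3", "a1"], ["a1"], ["a1", "a2"], ["a1"]]
key = learn_key_action(windows)                                  # -> "a1"
signals = list(predict_signals(["a2", "a1", "a3"], key, True))   # -> ["turn_on_light"]
```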



Abstract

The invention discloses a machine vision behavior intention prediction method applied to intelligent buildings, comprising the following steps: pedestrian detection, pedestrian tracking, establishment of spatio-temporal operators for action description, action detection and temporal boundary definition, environmental state change detection, key action clustering, and behavior prediction. The invention learns from historical video data without relying on manually defined rules and therefore has relatively few limitations. It analyzes and establishes connections between behaviors and environmental state changes, predicts the environmental state change that should be executed when a key action occurs, and outputs the corresponding prediction signal for automatic execution when the key action appears in the video. The method has high accuracy and meets real-time requirements, improving the intelligence of intelligent buildings, reducing manual operation to a certain extent, and bringing convenience to people's daily activities.
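A high-level skeleton of how the stages listed in the abstract could be chained; the stage names and the dictionary-of-callables interface are assumptions used only to show the data flow, not the patent's implementation.

```python
# Skeleton of the pipeline stages named in the abstract; the stage functions
# are supplied by the caller, and all names here are illustrative.
def behavior_intention_pipeline(frames, stages):
    detections = stages["detect"](frames)               # pedestrian detection
    tracks = stages["track"](detections)                # pedestrian tracking
    descriptors = stages["describe"](tracks)            # spatio-temporal action descriptors
    actions = stages["segment"](descriptors)            # action detection + time boundaries
    changes = stages["state_change"](frames)            # e.g. light goes from dark to bright
    key_actions = stages["cluster"](actions, changes)   # key-action clustering
    return stages["predict"](actions, key_actions)      # behavior prediction signals
```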

Description

Technical Field

[0001] The invention relates to the technical field of behavior prediction, and in particular to a machine vision behavior intention prediction method applied to intelligent buildings.

Background Technique

[0002] Nowadays people's requirements for quality of life are rising, and with the rapid development of science and technology in recent years, more and more intelligent buildings have appeared to meet users' various needs and improve their quality of life. When a user performs a corresponding behavior, the intelligent building can provide the corresponding function to satisfy the user's need, thereby realizing an intelligent living environment. This intelligence generally relies on behavior prediction technology: behavior prediction analyzes the behavior of people in video and, by learning over a period of video sequence, establishes the connection between behavior and environmental s...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V20/40; G06V40/20
Inventors: 周小平, 王佳, 郑洋
Owner: 盈嘉互联(北京)科技有限公司