
Method for obtaining position of human body interesting area relative to screen window

A technology relating to regions of interest and human-body areas, applied in the field of obtaining the position of a human-body region of interest relative to a screen window. It can solve problems such as inconvenient use, high cost, and sensitivity to sensor position, and achieves the effect of simple human-computer interaction.

Inactive Publication Date: 2013-09-25
ANHUI UNIVERSITY OF TECHNOLOGY


Problems solved by technology

[0002] In the field of human-machine interface technology, methods for obtaining the position of a human-body region of interest are receiving increasing attention. Traditional methods use stereo vision or depth sensors, which are costly and sensitive to sensor placement, or require special equipment that is inconvenient to use.
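The background-difference-and-threshold step that the invention uses instead of depth sensing can be sketched as follows. This is a minimal NumPy sketch, not the patent's implementation; the function name, the toy 4x4 scene, and the threshold value are illustrative assumptions:

```python
import numpy as np

def detect_body_mask(frame, background, threshold=30):
    """Mark as foreground (1) every pixel whose grey level differs from
    the background model by more than `threshold` (illustrative value)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Toy grayscale scene: static background, brighter 2x2 "body" region.
background = np.full((4, 4), 100, dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200
mask = detect_body_mask(frame, background)  # 1 only on the centre block
```

In practice the background model would be built from several frames of the fixed indoor scene before the user enters, which is why the method is described as suited to fixed-scene environments.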



Examples


Embodiment 1

[0026] Embodiment 1: applied to somatosensory (motion-sensing) games. The region of interest is the palm, and a single-colour glove can be worn; the glove colour should differ markedly from the clothing and the indoor background so that the palm area can be located accurately and quickly. The video window can be made to coincide with the screen window so that the palm can move across the entire screen, and the game program can trigger actions according to the palm's position. Of course, any part of the body can serve as the region of interest, as long as its colour differs sufficiently from the clothing and the background.
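The single-colour-glove segmentation described above can be sketched as a simple colour-distance threshold. A minimal NumPy sketch under assumed values; the function name, the glove colour, and the tolerance are illustrative, not specified by the patent:

```python
import numpy as np

def segment_glove(image_rgb, glove_color, tol=40):
    """Mark pixels whose RGB value lies within Euclidean distance `tol`
    of the glove colour; works when the glove contrasts strongly with
    the clothing and the background, as the embodiment requires."""
    diff = image_rgb.astype(np.int16) - np.array(glove_color, dtype=np.int16)
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return dist < tol

img = np.zeros((3, 3, 3), dtype=np.uint8)   # dark background
img[0, 0] = (250, 40, 40)                   # one "glove" pixel, near pure red
mask = segment_glove(img, glove_color=(255, 0, 0), tol=80)
```

A production system would more likely threshold in HSV space to reduce sensitivity to indoor lighting, but the contrast requirement stated in the embodiment is the same either way.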

Embodiment 2

[0027] Embodiment 2: applied to controlling a TV by hand. As in the previous example, the region of interest is the palm. The extracted palm features are the centroid position and the area of the minimum circumscribed rectangle. The rectangle areas for an open palm and a clenched fist differ greatly, so the two can be recognised as distinct states. The application generates the corresponding control action from these two states, for example an open palm indicating selection and a clenched fist indicating execution.
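The centroid and rectangle-area features above, and the open/clenched decision, can be sketched as follows. Note this sketch uses an axis-aligned bounding box for simplicity, whereas the patent's minimum circumscribed rectangle may be rotated; the threshold value is also an illustrative assumption:

```python
import numpy as np

def palm_features(mask):
    """Centroid and bounding-rectangle area of the foreground pixels.
    (Axis-aligned box for simplicity; the minimum circumscribed
    rectangle in the patent may be rotated.)"""
    ys, xs = np.nonzero(mask)
    centroid = (xs.mean(), ys.mean())
    area = (xs.max() - xs.min() + 1) * (ys.max() - ys.min() + 1)
    return centroid, area

def palm_state(area, open_threshold=25):
    """An open palm spans a much larger rectangle than a clenched fist."""
    return "open" if area >= open_threshold else "clenched"

open_mask = np.zeros((10, 10), dtype=np.uint8)
open_mask[2:9, 2:9] = 1          # open hand: 7x7 box, area 49
fist_mask = np.zeros((10, 10), dtype=np.uint8)
fist_mask[4:7, 4:7] = 1          # clenched fist: 3x3 box, area 9
_, open_area = palm_features(open_mask)
_, fist_area = palm_features(fist_mask)
```

The large gap between the two areas is what makes a single threshold sufficient to distinguish the two control states.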



Abstract

The invention discloses a method for obtaining the position of a human-body region of interest relative to a screen window, belonging to the technical field of man-machine interfaces. The method comprises the following steps: first, build a background model; next, detect the human-body area by capturing images containing a human body with a camera and extracting the body area via background differencing and thresholding; then, obtain the region of interest within the body area by matching features learned in advance through machine learning; finally, obtain the position of the region of interest relative to the screen window by coordinate transformation, according to the position and size of the camera's video window relative to the screen window. The video window can be placed at different positions within the screen window and scaled in size according to application requirements. The method is particularly suitable for indoor environments with fixed scenes.
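The coordinate-transformation step in the abstract amounts to a scale-and-offset mapping from camera-frame coordinates into the video window's position and size on the screen. A minimal sketch under assumed conventions (top-left origin, pixel units); the function and parameter names are illustrative:

```python
def frame_to_screen(point, frame_size, win_origin, win_size):
    """Scale a point from camera-frame coordinates into the on-screen
    video window, then offset by the window's position on the screen."""
    fx, fy = point
    fw, fh = frame_size      # camera frame resolution
    ox, oy = win_origin      # video window's top-left corner on screen
    ww, wh = win_size        # video window's on-screen size (may be scaled)
    return (ox + fx * ww / fw, oy + fy * wh / fh)

# 640x480 frame shown full-screen on a 1920x1080 display:
pos = frame_to_screen((320, 240), (640, 480), (0, 0), (1920, 1080))
# centre of the frame maps to the centre of the screen: (960.0, 540.0)
```

Because `win_origin` and `win_size` are parameters, the same mapping covers the zoomed or repositioned video windows the abstract mentions.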

Description

Technical field:

[0001] The invention belongs to the technical field of man-machine interfaces, and in particular relates to a method for obtaining the position of a human-body region of interest relative to a screen window.

Background technique:

[0002] In the field of human-machine interface technology, methods for obtaining the position of a human-body region of interest are receiving increasing attention. Traditional methods use stereo vision or depth sensors, which are costly and sensitive to sensor placement, or require special equipment that is inconvenient to use. The invention uses a monocular camera, and the position of the human-body region of interest relative to the screen window can be obtained without the user wearing special equipment.

Invention content:

[0003] The present invention addresses the above-mentioned problems in the prior art and provides a method for obtaining the position of the ROI of the human body re...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; H04N5/225
Inventors: 单建华, 佘慧莉
Owner: ANHUI UNIVERSITY OF TECHNOLOGY