
Multimedia human-computer interaction method based on camera and microphone

A human-computer interaction and camera technology, applied in the field of human-computer interaction, which addresses the problems that the user's control over the actions of a computer-displayed image is cumbersome and that synchronized action display is costly.

Inactive Publication Date: 2009-04-29
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0005] The technical problem to be solved by the present invention is that, at present, the user's control over the actions of the computer-displayed image is cumbersome and synchronized action display is costly. In line with the development direction of human-computer interaction, a convenient, fast, widely applicable and low-cost multimedia human-computer interaction method is proposed.




Embodiment Construction

[0038] The specific implementation of the method of the present invention is described below.

[0039] Assume that the computer-displayed image is a cartoon anthropomorphic figure. Taking the Logitech QuickCam Messenger camera as an example, three video sequence formats are available: ① 640×480 at 10 frames/second; ② 320×240 at 15 frames/second; ③ 160×120 at 15 frames/second.
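As a hedged illustration only (the patent does not name any capture library; OpenCV and device index 0 are assumptions), the second format above could be requested from the camera like this:

    import cv2  # assumption: OpenCV is used for capture; the patent names no library

    cap = cv2.VideoCapture(0)                # assume the QuickCam is video device 0
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)   # format (2): 320x240 at 15 frames/second
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)
    cap.set(cv2.CAP_PROP_FPS, 15)

    ok, frame = cap.read()                   # frame: BGR color image of shape (240, 320, 3)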

[0040] First, the color image acquired by the camera is converted using the luminance formula

[0041] Y=0.299R+0.587G+0.114B

[0042] into a grayscale image, and each frame is divided into m×m-pixel macroblocks. Taking the 320×240 sequence as an example, m=16 is a suitable choice, so each frame contains 20×15 macroblocks, as shown in figure 1. For a macroblock in the k-th frame, a (m+2dx_max)×(m+2dy_max) window is searched to find the block that best matches it, where dx_max and dy_max are the preset maximum horizontal and vertical displacements of the macroblock, as shown in fig...
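The following sketch illustrates paragraphs [0040]-[0042]: the luminance formula of [0041] for grayscale conversion, and a full search over a (m+2·dx_max)×(m+2·dy_max) window to find the best-matching block. It is a minimal Python/NumPy example written for this summary rather than the patent's own code; the SAD matching criterion and the concrete dx_max = dy_max = 8 values are assumptions, while m = 16 follows the text above.

    import numpy as np

    M = 16                  # macroblock size (m = 16 for the 320x240 sequence)
    DX_MAX, DY_MAX = 8, 8   # assumed preset maximum horizontal/vertical displacements

    def to_gray(rgb):
        # Y = 0.299 R + 0.587 G + 0.114 B  (luminance formula of [0041]); expects an
        # array of shape (height, width, 3), returns a float grayscale image.
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        return 0.299 * r + 0.587 * g + 0.114 * b

    def block_motion_vector(prev_gray, cur_gray, bx, by):
        # Displacement of the M x M macroblock with top-left corner (bx, by) in the
        # current frame, found by searching the (M + 2*DX_MAX) x (M + 2*DY_MAX) window
        # of the previous frame centred on the same position.  SAD is used as the
        # matching criterion (an assumption; the text only says "best match").
        h, w = cur_gray.shape
        block = cur_gray[by:by + M, bx:bx + M]
        best_sad, best_mv = np.inf, (0, 0)
        for dy in range(-DY_MAX, DY_MAX + 1):
            for dx in range(-DX_MAX, DX_MAX + 1):
                x, y = bx + dx, by + dy
                if x < 0 or y < 0 or x + M > w or y + M > h:
                    continue                       # candidate block falls outside the frame
                cand = prev_gray[y:y + M, x:x + M]
                sad = np.abs(block - cand).sum()   # sum of absolute differences
                if sad < best_sad:
                    best_sad, best_mv = sad, (dx, dy)
        return best_mv

For the 320×240 format this gives a 20×15 grid of block motion vectors per frame; one plausible way to obtain the head motion vector mentioned in the abstract (an assumption, not a statement of the patented method) is to average the vectors of the blocks covering the head region.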



Abstract

The invention relates to a method in which a user's head-action image captured by a camera is processed to extract a head motion vector, and this motion vector is used to control an image displayed on a computer so that it moves synchronously with the user's head; at the same time, the user's voice signal is detected by a microphone, and the detected voice controls the mouth action of the displayed image, producing a lifelike effect. The invention is low-cost, convenient to use and widely applicable; its cost is about one ten-thousandth of that of a motion-capture system. With the method of the invention the user hardly needs to operate anything manually, yet the computer-displayed image moves synchronously with the user's actions, freeing the user's hands and attention. The invention can be applied in many scenarios, such as instant messaging, distance education, multimedia teaching, electronic distorting mirrors, three-dimensional graphics control, cartoon announcers/hosts, interactive electronic pets, interactive dancing robots, mobile-phone cartoon shows, cartoon advertising reels, and software bundled with camera/microphone kits.
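To make the control flow described in the abstract concrete, here is a minimal sketch of the camera/microphone loop. The camera, microphone and avatar objects and the helper extract_head_motion are hypothetical interfaces introduced only for illustration (to_gray is the routine from the sketch after [0042]); mapping voice loudness to mouth opening via an RMS energy threshold is likewise an assumption, since the abstract only says that the detected voice controls the mouth action.

    import numpy as np

    VOICE_THRESHOLD = 0.02  # assumed RMS threshold for "voice present" (normalized samples)

    def rms_energy(samples):
        # Root-mean-square energy of one microphone buffer
        return float(np.sqrt(np.mean(np.square(samples))))

    def interaction_loop(camera, microphone, avatar):
        # Hypothetical main loop tying both input devices to the displayed image:
        # head motion from the camera drives the avatar's pose, and detected voice
        # from the microphone drives its mouth action.
        prev_gray = to_gray(camera.read())
        while True:
            gray = to_gray(camera.read())
            dx, dy = extract_head_motion(prev_gray, gray)   # e.g. averaged block motion vectors
            avatar.move_head(dx, dy)                        # synchronized head action

            samples = microphone.read()
            level = rms_energy(samples)
            if level > VOICE_THRESHOLD:
                avatar.open_mouth(level)                    # assumed loudness -> mouth opening
            else:
                avatar.close_mouth()
            prev_gray = gray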

Description

technical field
[0001] The invention belongs to the technical field of human-computer interaction and is a human-computer interaction method based on a camera and a microphone, which uses the camera and the microphone to obtain user control information and controls a computer-displayed image so that it performs corresponding actions.
Background technique
[0002] Human-computer interaction technology (Human-Computer Interaction Techniques) refers to technology that realizes human-computer dialogue in an effective way through computer input and output devices. Multimedia human-computer interaction is a new interactive technology based on eye tracking, speech recognition, gesture input, sensory feedback, and so on. With the development of science and technology, ideal human-computer interaction should be possible using people's everyday skills without special training, and the field is developing in this direction.
[0003] For example, in the QQ2006 instant messaging software, a 3D animation show...


Application Information

IPC (8): G06F3/01; G06T7/20; G06T15/70; G10L21/06; G06T13/40
Inventor 陈阳 (Chen Yang), 吴乐南 (Wu Lenan)
Owner SOUTHEAST UNIV