
1621 results about "Facial expression" patented technology

A facial expression is one or more motions or positions of the muscles beneath the skin of the face. According to one set of controversial theories, these movements convey the emotional state of an individual to observers. Facial expressions are a form of nonverbal communication. They are a primary means of conveying social information between humans, but they also occur in most other mammals and some other animal species. (For a discussion of the controversies on these claims, see Fridlund and Russell & Fernandez Dols.)

Photo Automatic Linking System and method for accessing, linking, and visualizing "key-face" and/or multiple similar facial images along with associated electronic data via a facial image recognition search engine

Active · US20070172155A1 · Quick search · Enhanced and improved organization, classification, and fast sorts and retrieval · Digital data information retrieval · Character and pattern recognition · Health professionals · Web crawler
The present invention provides a system and method for input of images containing faces for accessing, linking, and/or visualizing multiple similar facial images and associated electronic data for innovative new on-line commercialization, medical, and training uses. The system uses various image-capturing devices and communication devices to capture images and enter them into a facial image recognition search engine. Embedded facial image recognition techniques within the search engine extract facial images and encode them in a computer-readable format. The processed facial images are then entered for comparison into at least one database populated with facial images and associated information. Once the newly captured facial images are matched with similar “best-fit match” facial images in the search engine's database, the “best-fit” matching images and each image's associated information are returned to the user. Additionally, the newly captured facial image can be automatically linked to the “best-fit” matching facial images, with comparisons calculated and/or visualized.
Key new uses of the system include, but are not limited to:
  • input of user-selected facial images to find multiple similar celebrity look-a-likes, with automatic linking that returns the look-a-like celebrities' similar images, associated electronic information, and convenient opportunities to purchase fashion, jewelry, products, and services to better mimic the celebrity look-a-likes;
  • health monitoring and diagnostic use, by conveniently organizing and superimposing periodically captured patient images so health professionals can view patient progress;
  • entirely new classes of semi-transparent, superimposed training of your face to mimic other similar faces, such as celebrity look-a-like cosmetic applications and/or facial expressions;
  • intuitive automatic linking of similar facial images for enhanced information technology, with improved organization, classification, and fast retrieval; and
  • an improved method of facial-image-based indexing and retrieval of information from the web-crawler- or spider-searched Web, USENET, and other resources, providing new types of intuitive, easy-to-use searching, alone or combined with current keyword searching for optimized results.
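The patent does not disclose a specific matching algorithm. As a rough sketch, the “best-fit match” step can be modeled as a nearest-neighbor search over precomputed face embeddings; the function names, record layout, and the choice of cosine similarity here are assumptions for illustration, not details from the patent:

```python
import math

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def best_fit_matches(query, database, top_k=3):
    """Rank database records by similarity to the query embedding and
    return the associated information of the top_k best-fit matches."""
    ranked = sorted(database,
                    key=lambda rec: cosine(query, rec["embedding"]),
                    reverse=True)
    return [rec["info"] for rec in ranked[:top_k]]
```

In a real system the embeddings would come from a face-recognition model and the linear scan would be replaced by an approximate nearest-neighbor index, but the input/output contract is the same.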
Owner:VR REHAB INC +2

Method and system for measuring emotional and attentional response to dynamic digital media content

The present invention is a method and system to provide an automatic measurement of people's responses to dynamic digital media, based on changes in their facial expressions and attention to specific content. First, the method detects and tracks faces from the audience. It then localizes each of the faces and facial features, applies emotion-sensitive feature filters to extract emotion-sensitive features of the face, and determines the facial muscle actions of the face from those features. The changes in facial muscle actions are then converted into changes in affective state, called an emotion trajectory. In parallel, the method estimates eye gaze from extracted eye images and the three-dimensional facial pose from localized facial images. The gaze direction of the person is estimated from the estimated eye gaze and the three-dimensional facial pose, and the gaze target on the media display is then estimated from the gaze direction and the position of the person. Finally, the response of the person to the dynamic digital media content is determined by analyzing the emotion trajectory in relation to the time and screen positions of the specific digital media sub-content that the person is watching.
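The final step — relating the emotion trajectory to what the viewer was looking at — can be sketched as follows. This is a minimal illustration, not the patented method: the moving-average smoothing, the per-frame valence scores, and the region labels are all assumptions.

```python
def emotion_trajectory(frame_scores, window=3):
    """Smooth per-frame affect scores (e.g. valence derived from facial
    muscle actions) into an emotion trajectory with a trailing average."""
    traj = []
    for i in range(len(frame_scores)):
        lo = max(0, i - window + 1)
        segment = frame_scores[lo:i + 1]
        traj.append(sum(segment) / len(segment))
    return traj

def response_to_content(trajectory, gaze_targets, content_region):
    """Mean affect over the frames whose estimated gaze target falls on
    the given screen region of the media sub-content."""
    hits = [a for a, g in zip(trajectory, gaze_targets) if g == content_region]
    return sum(hits) / len(hits) if hits else None
```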
Owner:MOTOROLA SOLUTIONS INC

Method of facial coding monitoring for the purpose of gauging the impact and appeal of commercially-related stimuli

Inactive · US7113916B1 · Market predictions · Impact score · Crowds
A method of assessing consumer reaction to a marketing stimulus, involving the steps of (a) exposing a sample population to a marketing stimulus for a period of time, (b) interviewing members of the sample population immediately after exposure of the members to the marketing stimulus, (c) videotaping any facial expressions and associated verbal comments of individual members of the sample population during the exposure period and interview, (d) reviewing the videotaped facial expressions and associated verbal comments of individual members of the sample population to (1) detect the occurrence of action units, (2) detect the occurrence of a smile, (3) categorize any detected smile as duchenne or social smile, (4) detect the occurrence of any verbal comment associated with a detected smile, and (5) categorize any associated verbal comment as positive, neutral or negative, (e) coding a single action unit or combination of action units to a coded unit, (f) associating coded units with any contemporaneously detected smile, (g) translating the coded unit to a scored unit, (h) tallying the scored unit by scoring unit category, (i) repeating steps (d) through (h) throughout the exposure period, (j) repeating steps (d) through (h) for a plurality of the members of the sample population, (k) calculating an impact value for each scoring unit category by multiplying the tallied number of scored units for each scoring unit category by a predetermined impact factor for that scoring unit category, (l) calculating an appeal value for each scoring unit category by multiplying the tallied number of scored units for each scoring unit category by a predetermined appeal factor for that scoring unit category, (m) combining the impact values obtained for each scoring unit category to obtain an impact score, (n) combining the appeal values obtained for each scoring unit category to obtain an appeal score, and (o) representing the appeal and impact scores with an identification of the corresponding marketing stimulus to which the members were exposed.
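Steps (k) through (n) reduce to simple weighted sums over the tallies from step (h). A minimal sketch — the category names and factor values below are illustrative, since the patent leaves the actual factors "predetermined":

```python
def impact_and_appeal(tallies, impact_factors, appeal_factors):
    """Steps (k)-(n): multiply each category's tally of scored units by
    its predetermined impact and appeal factors, then combine the
    per-category values into overall impact and appeal scores."""
    impact_score = sum(n * impact_factors[cat] for cat, n in tallies.items())
    appeal_score = sum(n * appeal_factors[cat] for cat, n in tallies.items())
    return impact_score, appeal_score
```

For example, with 2 duchenne-smile units and 1 social-smile unit, illustrative impact factors {3.0, 1.0} and appeal factors {2.0, -1.0} yield an impact score of 7.0 and an appeal score of 3.0.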
Owner:SENSORY LOGIC

Classroom behavior monitoring system and method based on face and voice recognition

The invention discloses a classroom behavior monitoring system and method based on face and voice recognition. The method comprises the following steps: a camera acquires video of the students and teachers in classrooms; voice recording equipment acquires their speech; a main control processor preprocesses the received video and extracts the facial expression features and behavior features of the students and teachers; the main control processor processes the received voice information of the students and extracts the students' voice features; and the main control processor processes the received voice information of the teachers, extracts the teachers' voice features, calculates a teaching-effect score for each teacher, evaluates the teaching effect according to the score, and provides guidance suggestions. By observing the classroom behaviors of teachers and students, the disclosed system and method increase the accuracy and objectivity of the evaluation, help improve teaching methods, and raise teaching quality.
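The abstract does not give the scoring formula. One plausible reading is a weighted combination of the extracted features followed by threshold-based evaluation; the feature names, weights, and thresholds below are assumptions, not details from the patent:

```python
def teaching_effect_score(features, weights):
    """Hypothetical teaching-effect score: weighted sum of normalized
    classroom features (each feature assumed to lie in [0, 1])."""
    return sum(weights[name] * features[name] for name in weights)

def evaluate(score, thresholds=(60.0, 80.0)):
    """Map a score to an evaluation band (illustrative cutoffs)."""
    if score >= thresholds[1]:
        return "good"
    if score >= thresholds[0]:
        return "adequate"
    return "needs improvement"
```

For example, expression 0.8, behavior 0.7, and voice 0.9 with weights 40/30/30 give a score of 80.0, which the illustrative thresholds classify as "good".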
Owner:SHANDONG NORMAL UNIV

Method and system for analyzing big data of intelligent advertisements based on face identification

The invention relates to the field of intelligent advertisement delivery and data analysis, in particular to a method and system for analyzing big data of intelligent advertisements based on face identification. The method comprises the following steps: storing face image information in a local database and assigning corresponding user identifiers; starting the application and displaying a face identification detection interface, at least one part of which displays advertisement information; tracking and identifying faces and extracting their characteristic values; setting image thresholds and judging whether they are matched; calling the data associated with an identifier and pushing personalized media information; recording advertisement watching time, clicks, dwell time, and identified facial expressions, and uploading them to a server; clustering and analyzing the acquired multi-user information to obtain statistics; and adjusting the pushing of advertisement information according to those statistics. By adding an advertisement-pushing window to existing face detection equipment, the method and system can derive the consumption habits or behaviors of groups from data analysis, thereby providing guidance for commercial activities.
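The server-side clustering-and-statistics step can be sketched as a group-by aggregation over the uploaded interaction records. The record fields and the choice of facial expression as the grouping key are assumptions for illustration:

```python
from collections import defaultdict

def cluster_stats(records):
    """Group uploaded ad-interaction records by identified facial
    expression and compute per-group mean watch time and click rate."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["expression"]].append(rec)
    stats = {}
    for expression, recs in groups.items():
        n = len(recs)
        stats[expression] = {
            "mean_watch_s": sum(r["watch_s"] for r in recs) / n,
            "click_rate": sum(r["clicked"] for r in recs) / n,
        }
    return stats
```

The resulting per-expression statistics are what the final step would consult when adjusting which advertisements to push.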
Owner:宋柏君