
Multidimensional user identity identification method

A user identity identification technology applied in the field of smart home, addressing problems such as single-dimension identification, inability to handle complex scenarios, and ineffective user permission control

Active Publication Date: 2017-04-26
常州百芝龙智慧科技有限公司

AI Technical Summary

Problems solved by technology

Smart home appliances and smart security products can indeed make people's lives more convenient, but current technical means still cannot effectively solve the problem of user permission control.
Traditional text passwords, fingerprints and the like are highly secure but require active verification by the user; constrained by their input forms, they are still poorly suited to many everyday scenarios.
Biometric methods such as face recognition and voice recognition can be applied passively, but each recognizes only a single dimension and cannot meet the needs of complex scenarios.




Embodiment Construction

[0028] The technical implementation processes involved in the present invention are described below with reference to the accompanying drawings.

[0029] Face recognition: A user who needs to be granted permission first has a photo of their face taken by the camera; the photo is stored in the sample library as a comparison sample. When an image of a user whose permission must be determined is captured, it is first convolved with multiple Gabor filters of different scales and orientations (each convolution result is called a Gabor feature map) to obtain a multi-resolution transformed image. Each Gabor feature map is then divided into several disjoint local spatial regions; for each region, the brightness variation pattern of the local neighborhood pixels is extracted, and a histogram of these variation patterns is computed over that region. The histograms of all Gabor feature maps and all regions are concatenated into a high-dimensional feature hi...
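The Gabor feature extraction described above can be sketched as follows. This is a minimal illustration, assuming OpenCV and NumPy; a plain 8-neighbour local binary pattern stands in for the "brightness variation pattern of local neighborhood pixels", and the filter scales, orientation count, region grid, and the cosine-similarity matching coefficient are illustrative choices, not values taken from the patent.

```python
# Sketch of a Gabor + local-pattern histogram face feature (assumptions noted above).
import cv2
import numpy as np

def gabor_bank(scales=(7, 11, 15), orientations=4):
    """Build Gabor kernels at several scales and orientations (illustrative parameters)."""
    kernels = []
    for ksize in scales:
        for k in range(orientations):
            theta = np.pi * k / orientations
            kernels.append(cv2.getGaborKernel((ksize, ksize), sigma=ksize / 3.0,
                                              theta=theta, lambd=ksize / 2.0,
                                              gamma=0.5, psi=0))
    return kernels

def lbp_codes(img):
    """8-neighbour local binary pattern: encodes how each pixel's neighbourhood brightness varies."""
    img = img.astype(np.float32)
    center = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

def face_feature(gray, kernels, grid=(4, 4)):
    """Convolve with every Gabor kernel, split each feature map into disjoint regions,
    histogram the local patterns per region, and concatenate all histograms."""
    histograms = []
    for kern in kernels:
        fmap = cv2.filter2D(gray, cv2.CV_32F, kern)      # Gabor feature map
        codes = lbp_codes(np.abs(fmap))
        h, w = codes.shape
        for i in range(grid[0]):
            for j in range(grid[1]):
                region = codes[i * h // grid[0]:(i + 1) * h // grid[0],
                               j * w // grid[1]:(j + 1) * w // grid[1]]
                hist, _ = np.histogram(region, bins=256, range=(0, 256))
                histograms.append(hist / max(hist.sum(), 1))  # normalised region histogram
    return np.concatenate(histograms)                    # high-dimensional feature vector

def face_match_coefficient(feat_a, feat_b):
    """Cosine similarity as one plausible 'face matching coefficient'."""
    return float(np.dot(feat_a, feat_b) /
                 (np.linalg.norm(feat_a) * np.linalg.norm(feat_b) + 1e-9))
```

The feature of a captured frame would be compared with features of the prestored sample photos to produce the face matching coefficient used later in the method.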



Abstract

The invention provides a multidimensional user identity identification method. The method comprises the steps of: detecting a human body within the camera's field of view, extracting its facial features, comparing them with user pictures prestored in a sample library, calculating a face matching coefficient, and preliminarily judging whether a user with permission is present; receiving the user's voice through a microphone, converting the audio analog signal into a digital sequence, comparing it with user voiceprints prestored in the sample library, calculating a voiceprint matching coefficient, and combining the face matching coefficient and the voiceprint matching coefficient into an overall matching degree so as to judge whether the user has permission; and, when the user is judged to have permission, establishing a model for the user, dynamically tracking the user's body, matching the voice source position against the position obtained from body tracking, judging that a command was issued by the permitted user and is valid when the voice position matches the image position, and executing the command.
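The abstract fuses a face matching coefficient and a voiceprint matching coefficient into an overall matching degree and then cross-checks the localized voice source against the tracked body position before accepting a command. A minimal sketch of that decision logic is below; the weighted-sum fusion, the thresholds, and the one-metre position tolerance are illustrative assumptions, not values given in the patent.

```python
# Sketch of the fusion and position cross-check described in the abstract.
# Weights, thresholds and the 1 m tolerance are illustrative assumptions.
from dataclasses import dataclass
import math

@dataclass
class Position:
    x: float
    y: float

def matching_degree(face_coeff: float, voiceprint_coeff: float,
                    w_face: float = 0.6, w_voice: float = 0.4) -> float:
    """Fuse the two per-modality coefficients into one matching degree (weighted sum assumed)."""
    return w_face * face_coeff + w_voice * voiceprint_coeff

def has_permission(face_coeff: float, voiceprint_coeff: float,
                   threshold: float = 0.8) -> bool:
    """The user is accepted only when the fused matching degree clears the threshold."""
    return matching_degree(face_coeff, voiceprint_coeff) >= threshold

def command_is_valid(voice_source: Position, tracked_body: Position,
                     tolerance_m: float = 1.0) -> bool:
    """A command counts as valid only if the sound-source position agrees with the
    position obtained by dynamically tracking the permitted user's body."""
    distance = math.hypot(voice_source.x - tracked_body.x,
                          voice_source.y - tracked_body.y)
    return distance <= tolerance_m

# Example: a user seen by the camera and heard near the same spot issues a command.
if has_permission(face_coeff=0.92, voiceprint_coeff=0.85):
    if command_is_valid(Position(2.1, 0.4), Position(2.3, 0.5)):
        print("Command accepted: issued by a permitted user at a consistent position.")
```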

Description

Technical field
[0001] The invention belongs to the technical field of smart home and in particular relates to a multidimensional user identity identification method.
Background technique
[0002] With the continuous development of society and the advancement of technology, people's desire for a smart life has brought smart home products into daily life more and more. Smart home appliances and smart security products can indeed make people's lives more convenient, but current technical means still cannot effectively solve the problem of user permission control. Traditional text passwords, fingerprints and the like are highly secure but require active verification by the user; constrained by their input forms, they are still poorly suited to many everyday scenarios. Biometric methods such as face recognition and voice recognition can be applied passively, but each recognizes only a single dimension and cannot meet the needs of complex scenarios.
[0003] The em...

Claims


Application Information

IPC (8): G06K9/00, G06K9/46, G06T7/215, G10L17/00
CPC: G10L17/00, G06V40/172, G06V10/446, G06V10/50
Inventor: 叶伟
Owner: 常州百芝龙智慧科技有限公司