A feature point localization method based on hybrid reality

A technology of feature point positioning and mixed reality, applied in the field of image recognition and medical image processing

Inactive Publication Date: 2018-12-28
HEILONGJIANG TUOMENG TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] The purpose of the present invention is to provide a feature point positioning method based on mixed reality that solves the problem of matching a mixed reality model to patient identity information.



Examples


Embodiment 1

[0030] A method for extracting facial image feature point data comprises:

[0031] Step 1: Use 90 feature points to locate the face, distributed as follows: 18 points mark the mouth, 14 points the jaw, 12 points the eyes, 6 points the eyebrows, 4 points the cheeks, 10 points the nose, 4 points the nape of the neck, 10 points the ears, and 12 points the hair, as shown in figure 1.
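The 90-point regional allocation in step 1 can be captured as a small lookup table; the dictionary name and structure below are illustrative, not part of the patent:

```python
# Hypothetical allocation of the 90 facial feature points by region,
# following the distribution described in step 1.
FEATURE_ALLOCATION = {
    "mouth": 18,
    "jaw": 14,
    "eyes": 12,
    "eyebrows": 6,
    "cheeks": 4,
    "nose": 10,
    "nape": 4,
    "ears": 10,
    "hair": 12,
}

# The regional counts must account for all 90 feature points.
assert sum(FEATURE_ALLOCATION.values()) == 90
```

The assertion documents the invariant that the per-region counts sum to exactly 90 points.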

[0032] Step 2: Establish a Cartesian coordinate system with an arbitrary origin, record the coordinates of the feature points, and construct point cloud data. The point cloud data is a coordinate array with 90 rows and 3 columns: the three columns hold the x, y, and z coordinate values, and each row represents a different feature point.
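A minimal sketch of the 90-row, 3-column point cloud array from step 2, using plain Python lists; the function name and the random sample coordinates are assumptions for illustration only:

```python
import random

def build_point_cloud(points):
    """Arrange (x, y, z) feature-point coordinates as a 90-row,
    3-column array (a list of 90 [x, y, z] rows), matching step 2."""
    cloud = [[float(x), float(y), float(z)] for (x, y, z) in points]
    if len(cloud) != 90:
        raise ValueError("expected exactly 90 feature points")
    return cloud

# Illustrative input: 90 random coordinates measured from an arbitrary origin.
random.seed(0)
samples = [(random.uniform(-100, 100),
            random.uniform(-100, 100),
            random.uniform(-100, 100)) for _ in range(90)]
cloud = build_point_cloud(samples)
```

Each row corresponds to one feature point; the columns map to x, y, and z as described above.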

[0033] Step 3: Convert the point cloud data into a QR code for storage, so that mixed reality devices can scan and identify it.
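One way to prepare the point cloud for QR encoding, sketched with the standard library only. The serialization scheme (JSON, zlib compression, base64) is an assumption; the patent specifies only that the point cloud becomes a two-dimensional code. A QR library (e.g. the third-party `qrcode` package, also an assumption) could then render the payload string:

```python
import base64
import json
import zlib

def encode_point_cloud(cloud):
    """Serialize a 90x3 point cloud into a compact ASCII payload
    suitable for embedding in a QR code (step 3)."""
    raw = json.dumps(cloud, separators=(",", ":")).encode("ascii")
    return base64.b64encode(zlib.compress(raw)).decode("ascii")

def decode_point_cloud(payload):
    """Inverse transform, as the mixed reality device would apply
    after scanning and recognizing the code."""
    return json.loads(zlib.decompress(base64.b64decode(payload)))

# Round-trip check on illustrative data.
demo = [[float(i), float(i) + 0.5, float(i) - 0.5] for i in range(90)]
assert decode_point_cloud(encode_point_cloud(demo)) == demo
```

Compression keeps the payload small, which matters because QR code capacity limits how many coordinate digits fit in a single code.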



Abstract

The invention relates to a feature point positioning method based on mixed reality. The method comprises the following steps: (1) collecting human body feature points and recording them as point cloud data; (2) generating a two-dimensional code from the feature point cloud data; (3) obtaining the feature point cloud model by recognizing the two-dimensional code with the camera of the mixed reality device; (4) judging the similarity between the human body and the model by comparing the scanned feature point cloud with the model stored on the mixed reality device, then registering the two and calculating the model error. The invention constructs point cloud data from human body feature points, generates a two-dimensional code identifiable by mixed reality equipment, and compares the decoded data with a model stored in the mixed reality equipment, so as to determine whether the model is the information data of this human body, solving the problem of matching the mixed reality model with the patient's identity information.
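Step (4) of the abstract, judging similarity and computing the model error, can be sketched as follows. The abstract does not specify the metric, so this centroid-aligned root-mean-square error is an assumption, a simplified stand-in for the registration and error-calculation step:

```python
import math

def centroid(cloud):
    """Mean (x, y, z) of a point cloud."""
    n = len(cloud)
    return [sum(p[i] for p in cloud) / n for i in range(3)]

def registration_error(a, b):
    """Root-mean-square distance between two point clouds after
    translating each to its own centroid. Assumes the clouds have
    equal length and corresponding point order."""
    ca, cb = centroid(a), centroid(b)
    total = 0.0
    for pa, pb in zip(a, b):
        total += sum(((pa[i] - ca[i]) - (pb[i] - cb[i])) ** 2
                     for i in range(3))
    return math.sqrt(total / len(a))

# Illustrative clouds: cloud_b is a translated copy of cloud_a, so the
# centroid-aligned error should be (numerically) zero.
cloud_a = [[float(i), math.sin(i), math.cos(i)] for i in range(90)]
cloud_b = [[x + 5.0, y - 2.0, z + 1.0] for x, y, z in cloud_a]
```

A small error indicates the scanned person matches the stored model; a production system would likely use a full rigid registration (e.g. ICP) rather than this translation-only alignment.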

Description

technical field

[0001] The invention belongs to the fields of image recognition and medical image processing, and relates to a feature point positioning method based on mixed reality.

Background technique

[0002] The Mixed Reality device is Microsoft's first holographic computer device not limited by cables, allowing users to interact with digital content and holographic images in the real environment around them. At present, preoperative simulation planning and cross-space remote surgical interaction can be realized through mixed reality, enabling precise surgery and greatly reducing surgical risk. When the same set of mixed reality equipment serves multiple patients, the technology must deal with the problem of matching the mixed reality model with the patient's identity information; if they cannot be matched correctly, the results are unpredictable. Therefore, the present invention solves this problem through a fe...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/33, G06K9/00, G06K9/62
CPC: G06T7/344, G06T2207/30201, G06T2207/10028, G06V40/168, G06F18/22
Inventor: 邱兆文, 张健
Owner: HEILONGJIANG TUOMENG TECH CO LTD