
Database construction method, augmented reality fusion tracking method and terminal equipment

A database construction and augmented reality fusion tracking technology, applied in image data processing, still-image data indexing, still-image data retrieval, etc. It addresses the problem that existing image feature-point extraction methods cannot meet the practical needs of AR technology, achieving accurate and simple image comparison and improved accuracy.

Active Publication Date: 2020-05-15
SHICHEN INFORMATION TECH SHANGHAI CO LTD

AI Technical Summary

Problems solved by technology

However, due to differences in shooting environment, image size, exposure intensity, and/or shooting angle between the template image and the actual image, current image feature-point extraction methods cannot meet the practical needs of AR technology.



Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0063] As shown in Figure 1, an embodiment of the present invention provides a database construction method, including:

[0064] S1, collecting standard template images;

[0065] The terminal device turns on its camera and photographs the object to obtain a template image; alternatively, an existing or synthesized image stored on a computer or mobile phone may be used as the template image. A template image is an image of the object to be recognized: typically a planar image such as a book or magazine cover, but it may also be another kind of image, such as a portrait of a person or animal, a picture of a scenic spot, or a photograph of natural scenery. The object to be recognized may also be any image that can be fused with virtual content. A standard template image is an image of the object to be recognized captured in an ideal environment, such as the planar image represented by a book or magazine cover, whic...
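As a rough illustration of the database-construction idea, a template database can be sketched as a mapping from a template identifier to that template's feature descriptors plus the AR content to overlay once the template is recognized. The names here (`add_template`, the dictionary layout, the toy 4-D descriptors) are hypothetical illustrations, not taken from the patent:

```python
import numpy as np

def add_template(database, template_id, feature_vectors, ar_content):
    """Register one standard template image: store its feature descriptors
    and the AR content to overlay when this template is later matched."""
    database[template_id] = {
        "features": np.asarray(feature_vectors, dtype=np.float32),
        "ar_content": ar_content,
    }
    return database

# Usage: register a book cover described by three toy 4-D descriptors.
db = {}
add_template(db, "book_cover",
             [[0.1, 0.2, 0.3, 0.4],
              [0.5, 0.6, 0.7, 0.8],
              [0.9, 1.0, 1.1, 1.2]],
             ar_content="3d_model_01")
```

In a real system the descriptors would come from a feature extractor run on the standard template image, and the AR content would be a reference to a 3D model or video asset.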

Embodiment 2

[0089] An embodiment of the present invention provides an augmented reality fusion tracking method, including:

[0090] 200. Construct a standard template image database. In this embodiment of the present invention, the method described in Embodiment 1 is used to construct the standard template image database and is not repeated here;

[0091] 201. Collect images in a real scene; the image captured by the terminal device in a real scene may be called the first image. Specifically:

[0092] The camera of the terminal device captures images in the real scene. The images in the real scene may depict specific objects such as books, tables and chairs, portraits of people or animals, natural scenery, business cards, pictures, and other flat objects. Images in real scenes may be affected by shooting angle, size, lighting, and so on, and in some cases the image is even partially occluded, so the first image will generally differ from the standard template image in pixels, size,...

Embodiment 3

[0118] As shown in Figure 5, an embodiment of the present invention provides a terminal device, including a receiving unit 501, an image processing unit 503, an image matching unit 505, and a tracking fusion unit 507, wherein:

[0119] The receiving unit 501 is configured to receive a first image, where the first image is an image in a real scene captured by the terminal device when performing fusion tracking;

[0120] The image processing unit 503 is configured to scale the first image W times to form W+1 layers of images, where W is a natural number; to extract feature points on each layer of images; and to count the number of times each feature point appears across the layers. A feature point that appears U times is called a third strong feature point, where U is a natural number less than or equal to W+1;
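The pyramid-and-counting logic of unit 503 can be sketched as follows. This is a minimal illustration under stated assumptions: a naive 2x subsampling stands in for a proper image resize, a toy local-maximum detector stands in for a real feature detector (e.g. FAST or ORB), and all function names are hypothetical:

```python
import numpy as np

def build_pyramid(img, w):
    """Scale the image w times, giving w + 1 layers (layer 0 is the original)."""
    layers = [img]
    for _ in range(w):
        img = img[::2, ::2]  # naive 2x subsample stands in for a proper resize
        layers.append(img)
    return layers

def detect_local_maxima(img):
    """Toy detector: interior pixels strictly brighter than their 4 neighbours."""
    pts = []
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = img[y, x]
            if (v > img[y - 1, x] and v > img[y + 1, x]
                    and v > img[y, x - 1] and v > img[y, x + 1]):
                pts.append((y, x))
    return pts

def strong_feature_points(img, w, u):
    """Count, in base-layer coordinates, how often each feature point is
    detected across the w + 1 layers; keep points seen at least u times."""
    counts = {}
    for level, layer in enumerate(build_pyramid(img, w)):
        scale = 2 ** level
        for (y, x) in detect_local_maxima(layer):
            key = (y * scale, x * scale)  # map back to layer-0 coordinates
            counts[key] = counts.get(key, 0) + 1
    return [p for p, c in counts.items() if c >= u]
```

The intuition matches the claim: a feature that survives repeated rescaling (appears on many layers) is more robust to the size differences between the captured first image and the standard template image.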

[0121] The image matching unit 505 is configured to calculate the feature vector of each third strong feature point and compare the feature vector...
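A brute-force version of the comparison performed by unit 505 can be sketched like this, assuming the hypothetical database layout of template id to descriptor array; the scoring rule (mean nearest-neighbour Euclidean distance) is one plausible choice, not necessarily the patent's:

```python
import numpy as np

def match_to_database(query_vectors, database):
    """Compare the query descriptors against each stored template's
    descriptors; return the template whose descriptors lie closest
    on average (mean nearest-neighbour Euclidean distance)."""
    query = np.asarray(query_vectors, dtype=np.float32)
    best_id, best_score = None, float("inf")
    for template_id, template_vectors in database.items():
        tmpl = np.asarray(template_vectors, dtype=np.float32)
        # Pairwise distances: one row per query descriptor, one column per
        # template descriptor.
        dists = np.linalg.norm(query[:, None, :] - tmpl[None, :, :], axis=2)
        score = dists.min(axis=1).mean()  # mean nearest-neighbour distance
        if score < best_score:
            best_id, best_score = template_id, score
    return best_id

# Usage: two toy templates with 2-D descriptors.
db = {
    "cover_a": [[0.0, 0.0], [1.0, 1.0]],
    "cover_b": [[10.0, 10.0], [11.0, 11.0]],
}
```

Once the best-matching template is found, the terminal device can look up the AR content associated with that template and hand it to the tracking fusion unit 507.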



Abstract

The augmented reality fusion tracking method provided by the present invention includes: receiving a first image, which is an image in a real scene captured by a terminal device when performing fusion tracking; scaling the first image W times to form W+1 layers of images; extracting the feature points on each layer and counting the number of times each feature point appears across the layers, a feature point appearing U times being called a third strong feature point; calculating the feature vector of each third strong feature point and comparing it with the feature vectors of the standard template image feature points pre-stored in a database to determine the standard template image feature vector that matches it; determining, from the standard template image corresponding to the matched feature vector, the augmented reality content associated with that template image; and performing fusion tracking of the augmented reality content with the first image.

Description

technical field

[0001] The invention relates to the field of augmented reality technology, and in particular to a database construction method, an augmented reality fusion tracking method, and a terminal device.

Background technique

[0002] AR (Augmented Reality) technology is a human-computer interaction technology that applies virtual content to the real world through smart terminal devices and visualization techniques, so that the virtual content and the real world are superimposed in the same screen or space and presented to the user simultaneously. With the popularity of smart terminals, AR technology is applied more and more widely, and AR applications can be installed on smart terminals for users to experience. Specifically, the workflow of an AR application is as follows: the smart terminal captures image frames through its camera, recognizes the image frames, and determines the AR target object; tracks the AR t...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/51; G06T3/40; G06T3/00
CPC: G06F16/51; G06T3/40; G06T3/14
Inventor: 张小军, 刘力, 王伟楠, 涂意
Owner SHICHEN INFORMATION TECH SHANGHAI CO LTD