
AR interaction method and device

An AR interaction method and device, applied in the field of augmented reality, addressing the problem of too little interaction between displayed images and users

Inactive Publication Date: 2017-07-07
SHENZHEN THINKSKY TECH

AI Technical Summary

Problems solved by technology

[0005] The embodiments of the present invention provide an AR interaction method and device, aiming to solve the problem of too little interaction between the displayed image and the user in existing AR methods.



Examples


Embodiment 1

[0023] Figure 1 shows a flow chart of the AR interaction method provided by the first embodiment of the present invention, the details of which are as follows:

[0024] Step S11, displaying the 3D model image in the AR scene.

[0025] Specifically, the AR application is started, an object (usually a picture) in front of the camera device is scanned by the camera device (such as a camera) in the AR scene, and the scanned object is compared with the preset 3D model images; if the scanned object matches one of the preset 3D model images, the 3D model image corresponding to that object is displayed.
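The scan-and-match step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `PRESET_MODELS` table, marker identifiers, and function names are hypothetical, and a real AR application would match image features rather than compare string identifiers.

```python
# Minimal sketch of step S11: match a scanned object against preset 3D models
# and display the corresponding model when a match is found.
# Hypothetical: a real system would compare image features, not string IDs.

# Preset table mapping recognizable markers (usually pictures) to 3D models.
PRESET_MODELS = {
    "marker_cat": "cat_3d_model",
    "marker_car": "car_3d_model",
}

def match_scanned_object(scanned_marker_id):
    """Return the 3D model for a scanned marker, or None if no preset matches."""
    return PRESET_MODELS.get(scanned_marker_id)

def display_if_matched(scanned_marker_id):
    """Display the 3D model image in the AR scene when the scan matches a preset."""
    model = match_scanned_object(scanned_marker_id)
    if model is not None:
        return f"displaying {model} in AR scene"
    return "no match; nothing displayed"
```

For example, `display_if_matched("marker_cat")` returns `"displaying cat_3d_model in AR scene"`, while an unrecognized marker leaves the scene unchanged.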

[0026] It should be pointed out that the 3D model image can be displayed dynamically, and during display, a corresponding voice can be emitted in coordination with the actions of the 3D model image.

[0027] Step S12, acquiring the information of the user's finger, where the information of the user's finger includes the position of the user's fin...

Embodiment 2

[0051] Figure 2 shows a structural diagram of an AR interaction device provided by the second embodiment of the present invention. The AR interaction device can be applied to various mobile terminals. A mobile terminal may include user equipment that communicates with a radio access network (RAN) and one or more core networks; the user equipment may be a mobile phone (also called a "cellular" phone) or a computer with a mobile device. For example, the user equipment may also be a portable, pocket-sized, handheld, computer built-in, or vehicle-mounted mobile device that exchanges voice and/or data with the radio access network. For another example, the mobile device may include a smartphone, a tablet computer, a personal digital assistant (PDA), a point-of-sale (POS) terminal, or a vehicle-mounted computer. For ease of description, only the parts related to the embodiments of the present invention are shown.

[0052] The AR interaction device includes: an image display unit 21, an inform...
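The device structure of Embodiment 2 can be sketched as a composition of units, one per method step. Only the image display unit 21 is named in this excerpt; the other unit names below are hypothetical, inferred from the four steps in the abstract, and the internal logic is a simplified placeholder.

```python
# Sketch of the AR interaction device as composed units (Embodiment 2).
# Only "image display unit 21" appears in the excerpt; the remaining unit
# names are hypothetical, inferred from the abstract's four method steps.

class ImageDisplayUnit:
    """Displays the 3D model image in the AR scene."""
    def display(self, model):
        return f"displaying {model}"

class InformationAcquisitionUnit:
    """Acquires the user's finger information: (position, displacement)."""
    def acquire_finger_info(self):
        # Placeholder: a real unit would read touch or camera input.
        return (50, 60), (5, -3)

class JudgingUnit:
    """Judges whether the finger position lies on the 3D model image."""
    def finger_on_model(self, position, model_bounds):
        x, y = position
        left, top, right, bottom = model_bounds
        return left <= x <= right and top <= y <= bottom

class MovingUnit:
    """Moves the 3D model image according to the finger's displacement."""
    def move(self, model_position, displacement):
        return (model_position[0] + displacement[0],
                model_position[1] + displacement[1])

class ARInteractionDevice:
    def __init__(self):
        self.display_unit = ImageDisplayUnit()   # unit 21 in Figure 2
        self.info_unit = InformationAcquisitionUnit()
        self.judge_unit = JudgingUnit()
        self.move_unit = MovingUnit()
```

Splitting each claimed step into its own unit mirrors the common patent-drafting convention of mapping method steps one-to-one onto device modules.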



Abstract

The invention is applicable to the technical field of AR and provides an AR interaction method and device. The method comprises the following steps: displaying a 3D model image in an AR scene; obtaining information of a user's finger, where the information comprises the position and displacement of the user's finger; judging whether the position of the user's finger is on the 3D model image; and, when the position of the user's finger is on the 3D model image, moving the 3D model image according to the displacement of the user's finger. Through this method, the manner of interaction between the displayed 3D model image and users is enriched, the interaction operations are simple, and usability is improved.
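The four steps of the abstract can be sketched as a single interaction routine. This is a minimal illustration under a simplifying assumption: the 3D model image is hit-tested via a 2D screen-space bounding box, and all names (`interact`, parameter names) are hypothetical.

```python
# Sketch of the abstract's four steps: display the model, obtain the finger's
# position and displacement, judge whether the finger is on the model, and
# move the model by the displacement when it is.
# Assumption: a 2D bounding box stands in for the displayed 3D model image.

def interact(model_pos, model_size, finger_pos, finger_displacement):
    """Return the model's new top-left position after one interaction step."""
    left, top = model_pos
    width, height = model_size
    fx, fy = finger_pos

    # Judge whether the finger position lies on the 3D model image.
    on_model = left <= fx <= left + width and top <= fy <= top + height

    # When the finger is on the model, move it according to the displacement.
    if on_model:
        dx, dy = finger_displacement
        return (left + dx, top + dy)
    return model_pos  # finger off the model: position unchanged
```

For example, with a model at (10, 10) of size (20, 20), a finger at (15, 15) with displacement (5, 0) moves the model to (15, 10); a finger outside the bounds leaves it in place.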

Description

technical field

[0001] Embodiments of the present invention belong to the field of AR technologies and, in particular, relate to an AR interaction method and device.

Background technique

[0002] Augmented reality (AR) is a technology that calculates the position and angle of the camera image in real time and adds corresponding virtual images. The goal of this technology is to display the virtual world on the screen together with the real world.

[0003] However, since existing AR technology mainly presents images combined with the real world, there is little interaction with users.

[0004] Therefore, a new technical solution needs to be proposed to solve the above problem.

Contents of the invention

[0005] Embodiments of the present invention provide an AR interaction method and device, aiming to solve the problem of too little interaction between displayed images and users in existing AR methods.

[0006] The first aspect of the embodiments of the present inve...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01, G06F3/0484, G06F3/0488, G06T19/00
CPC: G06T19/006, G06F3/011, G06F3/0484, G06F3/0488
Inventors: 封林毅, 周雪松
Owner: SHENZHEN THINKSKY TECH