
AR model training method and device, electronic device and storage medium

A technology relating to AR models and their training method, applied in the field of image processing, which solves the problems of high labor cost, long acquisition cycles, and AR models that cannot change over time, while ensuring accuracy.

Active Publication Date: 2019-12-20
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD
Cites: 6 · Cited by: 0

AI Technical Summary

Problems solved by technology

With existing AR model training methods, the labor cost is high, the acquisition cycle is long, and the trained AR model cannot change over time.



Examples


Embodiment 1

[0057] Figure 1 is a schematic flowchart of the AR model training method provided in Embodiment 1 of the present application. The method can be executed by an AR model training device or a background service device, either of which can be implemented in software and/or hardware and integrated into any intelligent device with a network communication function. As shown in Figure 1, the AR model training method may include the following steps:

[0058] S101. Receive the first captured image and the second captured image of the target point of interest (POI), taken at a first shooting position and a second shooting position respectively and sent by the user through the client device, together with the world coordinates of the first shooting position and the second shooting position.

[0059] In a specific embodiment of the present application, the background service device may receive the first captured image and ...
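The excerpt above is truncated, but the server-side entry point of Embodiment 1 (accept the two shots and their shooting positions for later processing) can be sketched as follows. This is a minimal illustration only; the class, field, and function names are assumptions and do not appear in the application.

```python
# Minimal sketch of the data S101 assumes the background service receives.
# All names here are illustrative assumptions, not taken from the application.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class PoiShotUpload:
    poi_id: str                                        # identifier of the target POI
    first_image: bytes                                 # first captured image (encoded)
    second_image: bytes                                # second captured image (encoded)
    first_world_position: Tuple[float, float, float]   # world coordinates of shooting position 1
    second_world_position: Tuple[float, float, float]  # world coordinates of shooting position 2


def handle_poi_upload(upload: PoiShotUpload) -> None:
    """S101 (sketch): accept two shots of the same POI plus their shooting positions."""
    # Two distinct shooting positions are needed so that the POI's world
    # coordinates can later be recovered from the two views.
    if upload.first_world_position == upload.second_world_position:
        raise ValueError("shooting positions must differ")
    # Subsequent steps (screen-coordinate extraction, triangulation, AR model
    # training) are sketched after the later embodiments and the abstract.
```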

Embodiment 2

[0068] Figure 2 is a schematic flowchart of the AR model training method provided in Embodiment 2 of the present application. As shown in Figure 2, the AR model training method may include the following steps:

[0069] S201. Receive the first captured image and the second captured image of the target point of interest (POI), taken at the first shooting position and the second shooting position respectively and sent by the user through the client device, together with the world coordinates of the first shooting position and the second shooting position.

[0070] In a specific embodiment of the present application, the background service device may receive the first captured image and the second captured image of the target POI, taken at the first shooting position and the second shooting position respectively and sent by the user through the client device, together with the world coordinates of the first shooting position and the second shooting position. Specifically, the client device may first a...
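The excerpt does not spell out how the screen coordinates of the target POI are extracted from the two captured images. As one hedged illustration only, a feature detector and matcher such as ORB in OpenCV could yield corresponding pixel coordinates in both shots; the function below is an assumption, not the application's method.

```python
# Hypothetical illustration: obtain matching screen coordinates in the two
# captured images using ORB features (the application does not specify this step).
import cv2
import numpy as np


def matched_screen_coordinates(img1: np.ndarray, img2: np.ndarray):
    """Return two (N, 2) arrays of matched (u, v) pixel coordinates."""
    if img1.ndim == 3:
        img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    if img2.ndim == 3:
        img2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    return pts1, pts2
```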

Embodiment 3

[0083] Figure 5 is a schematic flowchart of the AR model training method provided in Embodiment 3 of the present application. As shown in Figure 5, the AR model training method may include the following steps:

[0084] S501. Obtain the location area where the user is located.

[0085] In a specific embodiment of the present application, the client device may acquire the location area where the user is located. Exemplarily, the location area is a circular area centered on the user's location with a preset length as its radius; alternatively, the location area may be an area of another regular shape, which is not limited here.

[0086] S502. If it is detected, according to the location area where the user is located, that the user has entered the recognizable range, perform image acquisition on the target POI at the first shooting position and the second shooting position within the location area, obtaining the first captured image and the first captured ima...
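A hedged sketch of the client-side check described in S501 and S502 is given below, assuming the location area is modeled as a circle of preset radius around the user and the "recognizable range" as a preset distance around the target POI; the distance model, radius value, and function names are assumptions.

```python
# Illustrative client-side check for S501/S502; the distance model and the
# 50 m recognizable radius are assumptions, not values from the application.
import math


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    earth_radius_m = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))


def user_in_recognizable_range(user_latlon, poi_latlon, recognizable_radius_m=50.0) -> bool:
    """S502 precondition (sketch): has the user entered the POI's recognizable range?"""
    return haversine_m(*user_latlon, *poi_latlon) <= recognizable_radius_m
```

When this check returns True, the client device would trigger image acquisition of the target POI at the first and second shooting positions within the location area.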



Abstract

The invention discloses an AR model training method and device, an electronic device and a storage medium, and relates to the field of computer vision. According to the specific implementation scheme, the method comprises the steps of: receiving a first captured image and a second captured image of a target POI, taken at a first shooting position and a second shooting position respectively and sent by a user through client equipment, together with the world coordinates of the first shooting position and the second shooting position; extracting a first group of screen coordinates and a second group of screen coordinates of the target POI from the first captured image and the second captured image respectively; calculating the world coordinates of the target POI according to the first group of screen coordinates, the second group of screen coordinates, and the world coordinates of the first shooting position and the second shooting position; and training an AR model of the target POI according to the world coordinates of the target POI. According to the embodiments of the invention, the labor cost and the time cost can be effectively reduced, and the trained AR model gains the ability to change over time.
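The central computation in the abstract, recovering the world coordinates of the target POI from the two groups of screen coordinates and the two shooting positions, amounts to classical two-view triangulation. The sketch below uses the direct linear transform; the 3x4 projection matrices p1 and p2 (camera intrinsics plus pose at each shooting position) are assumptions, since the excerpt does not state how they are formed from the shooting positions' world coordinates.

```python
# Hedged sketch: two-view triangulation of the POI's world coordinates.
# p1 and p2 are assumed 3x4 projection matrices for the two shooting positions;
# uv1 and uv2 are the POI's screen coordinates in the first and second images.
import numpy as np


def triangulate_poi(p1: np.ndarray, p2: np.ndarray,
                    uv1: np.ndarray, uv2: np.ndarray) -> np.ndarray:
    """Return (x, y, z) world coordinates of a point seen at uv1 in view 1 and uv2 in view 2."""
    # Direct linear transform: each view contributes two linear constraints on the
    # homogeneous world point X, namely u * (P[2] @ X) = P[0] @ X and v * (P[2] @ X) = P[1] @ X.
    a = np.vstack([
        uv1[0] * p1[2] - p1[0],
        uv1[1] * p1[2] - p1[1],
        uv2[0] * p2[2] - p2[0],
        uv2[1] * p2[2] - p2[1],
    ])
    _, _, vt = np.linalg.svd(a)
    x = vt[-1]
    return x[:3] / x[3]
```

With the POI's world coordinates recovered in this way for each pair of shots, the AR model of the target POI can be trained on coordinates contributed by users over time, which is consistent with the abstract's claim that the trained model gains the ability to change over time rather than relying on manually collected photos.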

Description

Technical field

[0001] The present application relates to the technical field of image processing, and further to computer vision technology, in particular to an AR model training method and device, an electronic device and a storage medium.

Background technique

[0002] AR (Augmented Reality) is a technology that calculates the position and angle of camera images in real time and superimposes corresponding virtual images. In other words, AR displays virtual information in the real world and allows people to interact with it: it seamlessly connects reality and virtual information through technical means, constructing three-dimensional scenes that display things which do not exist in reality and linking them with real life.

[0003] In existing AR model training methods, photos are usually collected manually, regardless of season (spring, summer, autumn, winter) or weather (sunny, rainy, etc.); the collected photos are then classified and annotated with N key points to train ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00; G06T7/70
CPC: G06T19/006; G06T7/70
Inventor: 朱婧思, 宋鹏程, 邓国川, 江志磊
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD