
Method, device and system for fusion display of real object and virtual scene

A technology for the fusion display of real objects and virtual scenes, applied in the multimedia field. It addresses the problems that the CPU burden is too heavy and the display terminal cannot show the result smoothly in real time, with the effects of saving computing time, reducing the CPU burden, and increasing the system speed.

Active Publication Date: 2017-01-04
福建凯米网络科技有限公司

AI Technical Summary

Problems solved by technology

Because the hardware performance of an embedded system is lower than that of a PC, if the CPU in an embedded solution is also required to perform the portrait-extraction processing described above, the CPU burden becomes too heavy to achieve smooth real-time display on the display terminal.



Examples


Embodiment 1

[0058] As shown in Figure 1, the present invention provides a method for the fusion display of a real object and a virtual scene, which includes:

[0059] S100: obtain images captured by the camera in real time;

[0060] S200: extract an object from the image; the object may be a portrait;

[0061] S300: update the object into the virtual scene in real time to obtain an updated picture.
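The three steps above can be illustrated with a minimal chroma-key sketch. This is an assumption for illustration only: the patent does not fix a particular extraction algorithm, and in the embedded solution the extraction runs on the GPU rather than on the CPU as it does here. The OpenCV calls, the green-screen HSV range, and the placeholder virtual scene are all illustrative choices.

```python
# Minimal illustrative sketch of S100-S300 (assumed: solid green background, OpenCV
# on a desktop; the patent performs the extraction step on the embedded GPU instead).
import cv2
import numpy as np

def extract_object(frame_bgr):
    """S200: extract the foreground object (e.g. a portrait) with a simple chroma key."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Pixels inside this HSV range are treated as the green background (assumed values).
    background = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    return cv2.bitwise_not(background)          # 255 where the object is

def update_scene(frame_bgr, mask, scene_bgr):
    """S300: composite the extracted object onto the rendered virtual scene."""
    scene = cv2.resize(scene_bgr, (frame_bgr.shape[1], frame_bgr.shape[0]))
    mask3 = cv2.merge([mask, mask, mask])
    return np.where(mask3 > 0, frame_bgr, scene)

cap = cv2.VideoCapture(0)                        # S100: real-time camera capture
virtual_scene = np.full((480, 640, 3), (40, 20, 20), dtype=np.uint8)  # placeholder scene
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = extract_object(frame)
    updated = update_scene(frame, mask, virtual_scene)
    cv2.imshow("fusion display", updated)        # show the updated picture in real time
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```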

[0062] The present invention realizes the real-time synthesis of objects and virtual scenes through the above solutions. In the present invention, the virtual scene includes a 3D virtual stage, a 3D virtual reality scene, or a 3D video.

[0063] The 3D virtual stage is a special case of the 3D virtual reality scene: a real stage is simulated with computer technology to achieve a three-dimensional, realistic stage effect.

[0064] 3D virtual reality scene technology is a computer simulation system that can create and experience a virtual world. It uses a computer to generate a 3D simu...

Embodiment 2

[0084] As shown in Figure 2, the present invention also provides a device for the fusion display of a real object and a virtual scene, including:

[0085] Image acquisition unit 1: used to acquire images captured by the camera in real time;

[0086] Object extraction unit 2: used to extract objects from the image;

[0087] Picture updating unit 3: used to update the object into the virtual scene in real time to obtain an updated picture.

[0088] Through the above device, the extraction of the object from the image and the synthesis of the object with the virtual scene are realized.

[0089] The device for the fusion display of a real object and a virtual scene according to the present invention further includes a picture display unit 4: used to update and display the updated picture on a display terminal in real time. Through the picture display unit 4, the picture updated in real time can be watched on the display terminal.
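A minimal structural sketch of units 1 through 4 follows, assuming Python; the class and method names are hypothetical, and the extraction and display bodies are placeholders, since the patent specifies the units only by their function.

```python
# Structural sketch of units 1-4 of the device (hypothetical class names; the patent
# describes the units functionally, not any particular implementation).
import numpy as np

class ImageAcquisitionUnit:            # unit 1: acquires camera frames in real time
    def acquire(self):
        return np.zeros((480, 640, 3), dtype=np.uint8)   # placeholder for a camera frame

class ObjectExtractionUnit:            # unit 2: extracts the object (e.g. a portrait)
    def extract(self, frame):
        return frame > 0               # placeholder mask; the real device runs this on the GPU

class PictureUpdatingUnit:             # unit 3: updates the object into the virtual scene
    def update(self, frame, mask, scene):
        return np.where(mask, frame, scene)

class PictureDisplayUnit:              # unit 4: displays the updated picture in real time
    def display(self, picture):
        print("displaying frame of shape", picture.shape)

class FusionDisplayDevice:
    def __init__(self):
        self.acquisition = ImageAcquisitionUnit()
        self.extraction = ObjectExtractionUnit()
        self.updating = PictureUpdatingUnit()
        self.display_unit = PictureDisplayUnit()

    def run_once(self, scene):
        frame = self.acquisition.acquire()
        mask = self.extraction.extract(frame)
        picture = self.updating.update(frame, mask, scene)
        self.display_unit.display(picture)

FusionDisplayDevice().run_once(np.full((480, 640, 3), 30, dtype=np.uint8))
```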

[0090] The device for fusing and display...

Embodiment 3

[0100] As shown in Figure 3, the present invention also provides an embedded device 10 for the fusion display of real objects and virtual scenes, including a camera 20, a GPU 30 and a processor 40;

[0101] The camera 20 is used to capture images in real time;

[0102] The GPU 30 is used to extract objects from the image;

[0103] The processor 40 is used to update the object into the virtual scene in real time to obtain an updated picture, and to update and display the updated picture on the display terminal in real time.

[0104] The processor 40 is further configured to: encode the picture to obtain video data; obtain audio data; and package the audio data and the video data to obtain audio-video data.
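As an illustration of the packaging step, the sketch below muxes an already-encoded video stream with an audio stream using the ffmpeg command-line tool. This is a stand-in only: the embedded device in the patent uses the Android system's own encoding and packaging facilities, and the file names here are hypothetical.

```python
# Illustration only: the patent packages audio and video with the Android system's own
# media facilities; here the same mux step is sketched with the ffmpeg command-line tool.
import subprocess

def mux_audio_video(video_path, audio_path, output_path):
    """Package an already-encoded video stream and an audio stream into one container."""
    subprocess.run(
        ["ffmpeg", "-y",
         "-i", video_path,       # encoded pictures, e.g. an H.264 elementary stream
         "-i", audio_path,       # captured audio, e.g. an AAC stream
         "-c", "copy",           # no re-encoding: just interleave the two streams
         output_path],
        check=True,
    )

# Example usage (hypothetical file names):
# mux_audio_video("pictures.h264", "microphone.aac", "fused_display.mp4")
```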

[0105] The embedded device of the present invention can run the Android operating system and use the image processing functions of the Android system to encode the pictures and package the audio data and video data, thereby greatly reducing the cost of the device.

[0106] In the em...



Abstract

The invention relates to a method, device and system for the fusion display of a real object and a virtual scene. The method comprises the steps of: obtaining an image captured by a camera in real time; extracting an object from the image; and updating the object into the virtual scene to obtain an updated picture. The method uses the built-in GPU of the embedded equipment for the matting operation, so it does not occupy CPU time and increases the system speed. Meanwhile, the method uses the processor of the embedded equipment to encode the synthesized picture of the portrait and the virtual scene to obtain video data; the encoding greatly reduces the size of the video data, which facilitates smooth network transmission and real-time, smooth display on other clients.

Description

Technical field

[0001] The present invention relates to the field of multimedia technology, and in particular to a method, device and system for the fusion display of a real object and a virtual scene.

Background technique

[0002] Virtual scene synthesis technology has been widely used in the recording of TV studio programs and in film production, for example in weather forecast programs. Virtual scene synthesis is a technology that extracts the portrait from the solid-color background captured by the camera, superimposes and synthesizes it with a rendered virtual scene background, and then outputs the synthesized picture. This technology is currently implemented on the PC platform and also requires a professional camera to capture video and feed it into the PC. All the equipment is integrated and sold together; it involves many devices, is expensive, and is generally used only on professional occasions such as TV studios.

[0003] Embedded s...

Claims


Application Information

IPC(8): H04N5/262, H04N5/265
Inventor: 郑远, 冯皓, 林鎏娟, 林剑宇, 刘灵辉
Owner: 福建凯米网络科技有限公司