Remote interaction method and system

A remote interaction method and system, applied in transmission systems, the input/output processes of data processing, instruments, etc. It addresses the problems of high network bandwidth consumption, large image data, and rigid interaction, and achieves the effect of reducing the bandwidth load.

Inactive Publication Date: 2017-09-29
SAGACITY TECH

AI-Extracted Technical Summary

Problems solved by technology

[0003] However, existing remote interaction technologies, such as video conferencing, electronic whiteboards, or online teaching systems, suffer from high bandwidth requirements and rigid interaction.
Existing remote interaction technology must capture the data of each client, such as images, annotations, or input, together with the background, and then transmit the captured images to the other users participating in the discussion. Th...


Abstract

The invention relates to a remote interaction method and system. Multiple user devices log in to the same server host, and the server host identifies and allocates connections between the user devices. The server host or one of the user devices builds a shared surface, and a shared object is created on the shared surface. The shared object is transmitted to one or more designated user devices. The user devices each execute an interactive program to build their own virtual surfaces. When a user device makes an input on its interactive program, the input is captured through that device's virtual surface and objectified. The objectified input is transmitted to one or more designated user devices. The virtual surfaces of the user devices display the received shared object and objectified input on the interactive program in a superimposed manner. This achieves remote interaction while reducing the bandwidth load.

Application Domain

Transmission; Digital output to display device

Technology Topic

Virtual surface; User device


Examples

  • Experimental program(1)

Example Embodiment

[0028] Various exemplary embodiments will be described more fully below with reference to the accompanying drawings, in which some exemplary embodiments are shown. However, the inventive concept may be embodied in many different forms and should not be construed as being limited to the exemplary embodiments set forth herein. To be precise, the provision of these exemplary embodiments will make the present invention detailed and complete, and will fully convey the scope of the concept of the present invention to those skilled in the art. In the drawings, the sizes and relative sizes of layers and regions may be exaggerated for clarity. Similar numbers always indicate similar components.
[0029] It should be understood that, although the terms first, second, third, etc. may be used herein to describe various components or signals, etc., these components or signals should not be limited by these terms. These terms are used to distinguish one component from another, or one signal from another signal. In addition, as used herein, the term "or" may include any one or all combinations of more of the associated listed items as appropriate.
[0030] See Figure 1, which is a schematic diagram of an embodiment of the remote interactive system of the present invention. As shown in the figure, the remote interactive system 10 includes a server host 11, a network 13, and user devices 15, 17, and 19.
[0031] In this embodiment, three user devices 15, 17, 19 are taken as an example, but the number is not limited thereto. The user devices 15, 17, 19 can each be one of a desktop computer, a notebook computer, a smart phone, and a tablet computer, but are not limited thereto. The user devices 15, 17, 19 can log in to the server host 11 via the network 13, and the server host 11 can identify and allocate connections between the user devices.
[0032] The server host 11 or one of the user devices 15, 17, 19 creates a shared object 101, and displays the shared object 101 on the user devices 15, 17, 19 respectively.
[0033] The user can view or operate the shared object 101, and can make an input on an interactive program of the user device. The user device can objectify the input and transmit the objectified input directly to one or more designated user devices. Alternatively, the objectified input can be sent to the server host 11 first, and the server host 11 then sends it to the designated user devices, where it is displayed. For example, as shown in Figure 1, the user device 15 makes an input 103 (such as "nice") and the user device 17 makes an input 105 (such as "good"), and the input 103 and the input 105 are respectively transmitted to each of the user devices 15, 17, 19 for display. Therefore, by objectifying the input, remote interaction is achieved while the bandwidth load is reduced.
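As a minimal sketch of the flow described above (the device names, object fields, and helper functions are illustrative, not taken from the patent): each input becomes a small object record and is fanned out only to designated devices, instead of streaming a captured screen image.

```python
# Hypothetical sketch: objectify an input and fan it out to designated
# user devices instead of transmitting captured screen images.
import json

def objectify_input(kind, payload):
    """Wrap a raw input (e.g. the annotation "nice") as a small shareable object."""
    return {"kind": kind, "payload": payload}

def fan_out(obj, recipients, send):
    """Serialize the objectified input once and send it to each designated device."""
    message = json.dumps(obj)
    for device in recipients:
        send(device, message)

# Collect outgoing messages in a list to stand in for the network layer.
outbox = []
note = objectify_input("annotation", "nice")
fan_out(note, ["device17", "device19"], lambda d, m: outbox.append((d, m)))
```

A small JSON record like this is orders of magnitude smaller than a full-screen image capture, which is the bandwidth saving the description claims.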
[0034] See Figure 2, which is a schematic diagram of an embodiment of the remote interactive system of the present invention. As shown in the figure, the remote interactive system 20 includes a server host 11 and user devices 15, 17, and 19. The server host 11 has a topology data table 111, a connection status analysis module 113, and a shared layer module 115. The user device 15 has a layer module 151 and an objectification module 153. The user device 17 has a layer module 171 and an objectification module 173. The user device 19 has a layer module 191 and an objectification module 193.
[0035] The user devices 15, 17, and 19 can log in to the server host 11 via the network, and the server host 11 recognizes and allocates connections between the user devices 15, 17, and 19. The connections between the user devices 15, 17, and 19 can form a peer-to-peer (P2P) architecture, but it is not limited to this. The topology data table 111 of the server host 11 can record the link data of each user device 15, 17, 19 with each other, and the server host 11 can allocate the connections between the user devices 15, 17 and 19 according to the topology data table 111.
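The topology data table described above could be sketched as follows (a hypothetical in-memory structure; the patent does not specify its format): the server records, for each logged-in device, which peers it may link to, and fully meshes each new device with the existing ones.

```python
# Hypothetical sketch of a topology data table: device -> set of peers
# it is allocated direct (P2P) connections to.
topology = {}

def register(device):
    """Log a device in and mesh it with every device already registered."""
    topology[device] = set(topology.keys())   # link to all existing peers
    for other in topology:
        if other != device:
            topology[other].add(device)       # existing peers link back

for dev in ("device15", "device17", "device19"):
    register(dev)
```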
[0036] In an embodiment of the present invention, the shared layer module 115 of the server host 11 establishes a shared surface, which is a virtual layer on which a shared object can be created. The shared object may be at least one of a static image, a dynamic image, a 3D object, and a virtual reality object, but is not limited thereto.
[0037] The server host 11 can transmit the shared object to one or more designated user devices 15, 17, and 19. The user devices 15, 17, and 19 respectively execute an interactive program to create a virtual surface. The interactive program can be in the form of a web page or an application program, but is not limited thereto. The virtual surface is a virtual layer that can cover all of the content of the interactive program or, as needed, only a specific area for interactive transmission. The virtual surface does not need to be displayed on the user devices 15, 17, 19 for the user to view.
[0038] The user can operate the user devices 15, 17, and 19 to make inputs on their interactive programs. In this embodiment, the user device 15 is taken as an example for description. When the user operating the user device 15 makes an input on its interactive program, the user device 15 can capture the input through its virtual surface and use the objectification module 153 to objectify the input. The input can be a revision, a description, an annotation, or an operation, but is not limited thereto. The objectification process can be a vectorization action, but is not limited thereto. For example, the input image or trajectory can be vectorized. A vectorized object has the advantages of a minimal file size, arbitrary scaling without distortion, and ease of transmission, and its movement, rotation, scaling, and filling can be manipulated at will.
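The vectorization idea can be sketched as follows (an illustrative toy, not the patent's actual encoding): a handwritten trajectory stored as a list of points scales without distortion and stays far smaller than a bitmap capture of the same stroke.

```python
# Hypothetical sketch: store an input trajectory as vector points so it
# can be scaled arbitrarily without distortion.
def vectorize(points):
    """Represent a captured stroke as a vector object."""
    return {"type": "stroke", "points": list(points)}

def scale(obj, factor):
    """Scale a vector object losslessly by multiplying its coordinates."""
    return {"type": obj["type"],
            "points": [(x * factor, y * factor) for x, y in obj["points"]]}

stroke = vectorize([(0, 0), (1, 2), (3, 1)])
big = scale(stroke, 10)   # 10x larger, no pixelation, same three points
```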
[0039] The user device 15 transmits the objectified input to one or more designated user devices 17 and 19. The user device 15 can transmit the objectified input to the designated user device 17 or 19 directly via a network connection, or via the server host 11. The transmission of the objectified input conforms to the specifications of the Extensible Markup Language (XML) and scripting languages, but is not limited thereto. The virtual surfaces of the user devices 15, 17, and 19 can respectively display the received shared objects and objectified inputs in their interactive programs in a superimposed manner, so as to present the shared objects and inputs for the user to view or operate.
[0040] See Figure 3, which is a schematic diagram of the superimposed display of an embodiment of the remote interactive system of the present invention; refer also to Figure 2. The shared layer module 115 of the server host 11 creates a shared surface, which is a virtual layer on which a shared object 31 can be created. In this embodiment, a house image is used as an example of the shared object 31. The virtual surfaces of the user devices 15, 17, and 19 can display the received shared object 31 in their interactive programs for users to view or operate. The users can use the user devices 15, 17, 19 to perform actions such as revision, description, annotation, or operation to achieve the purpose of interactive discussion. For example, if the user operates the user device 15 to make an input 35 (such as "nice") on its interactive program, the user device 15 can capture the input 35 through its virtual surface and use the objectification module 153 to objectify the input 35. The user device 15 then transmits the objectified input 35 to the user devices 17 and 19. Another user operates the user device 17 to make an input 33 (such as "good") on its interactive program; the user device 17 captures the input 33 through its virtual surface, uses the objectification module 173 to objectify it, and then transmits the objectified input 33 to the user devices 15 and 19.
[0041] The virtual surfaces of the user devices 15, 17, and 19 can respectively display the received shared object 31 and the objectified inputs 33 and 35 in their interactive programs in a superimposed manner. As shown in the display screen 37, the shared object 31, the input 35 of the user device 15, and the input 33 of the user device 17 are displayed in an overlapping manner as the shared object 31', input 33', and input 35' on the display screen 37. Therefore, the purpose of remote interactive discussion is achieved, and the bandwidth load is reduced by objectifying the input.
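The superimposed display can be sketched as simple layer compositing (an assumed model; the patent does not specify the rendering order beyond superimposition): the shared object forms the bottom layer and each objectified input is drawn on top.

```python
# Hypothetical sketch: a virtual surface composites the shared object
# and the received objectified inputs in layer order.
def compose(shared, inputs):
    """Return the draw order: shared object first, inputs on top."""
    layers = [shared] + inputs
    return [layer["label"] for layer in layers]

# Mirrors display screen 37: house 31', input 33' ("good"), input 35' ("nice").
screen = compose({"label": "house 31'"},
                 [{"label": "good 33'"}, {"label": "nice 35'"}])
```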
[0042] See Figure 4, which is a schematic diagram of the superimposed display of 3D objects in an embodiment of the remote interactive system of the present invention. The difference from the embodiment illustrated in Figure 3 is that the shared object and objectified inputs in this embodiment are 3D objects. The shared object 41 is exemplified by a 3D head object. The user operates the user device 15 to make an input 45 on the interactive program, where the input 45 is exemplified by a 3D glasses object. Another user operates the user device 17 to make an input 43 on its interactive program, where the input 43 is exemplified by a 3D hair object. The virtual surfaces of the user devices 15, 17, and 19 respectively display the received shared object 41 and objectified inputs 43 and 45 in their interactive programs in a superimposed manner. As shown in the display screen 47, the shared object 41, the input 45 of the user device 15, and the input 43 of the user device 17 are displayed in a superimposed manner as the shared object 41', input 43', and input 45' on the display screen 47. The user can operate the shared object 41', input 43', or input 45' to perform an action 49, where the action 49 can be rotation, movement, or zooming, but is not limited thereto. This achieves remote interactive discussion of 3D or virtual reality objects while reducing the bandwidth load by objectifying the input.
[0043] In an embodiment of the present invention, the shared surface can be created by one of the user devices 15, 17, 19, and the shared object created on the shared surface is then transmitted to one or more designated user devices, thereby achieving the purpose of remote interactive discussion.
[0044] In an embodiment of the present invention, the input is a revision, a description, an annotation, or an operation, and the operation can be a compound action of multiple instructions, such as rotation, movement, or zooming, but is not limited to this.
[0045] In an embodiment of the present invention, the objectified input can be attached to the shared object. When one of the user devices 15, 17, 19 operates the shared object to run an action, the objectified input can change with the action.
[0046] In an embodiment of the present invention, whether the objectified input is attached to the shared object can be selected. If the objectified input is chosen to be attached to the shared object, then when one of the user devices 15, 17, 19 operates the shared object to run an action, the objectified input changes with that action. If the objectified input is chosen not to be attached to the shared object, then when one of the user devices 15, 17, 19 operates the shared object to run an action, the objectified input does not change with that action.
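The attachment semantics above can be sketched as follows (an illustrative model with assumed field names): when an action moves the shared object, attached inputs follow it and detached inputs stay where they are.

```python
# Hypothetical sketch: propagate a move action from the shared object to
# the objectified inputs that are attached to it.
def run_action(shared, inputs, dx, dy):
    """Move the shared object; attached inputs follow, detached ones do not."""
    shared["pos"] = (shared["pos"][0] + dx, shared["pos"][1] + dy)
    for obj in inputs:
        if obj["attached"]:
            obj["pos"] = (obj["pos"][0] + dx, obj["pos"][1] + dy)

shared = {"pos": (0, 0)}
notes = [{"pos": (1, 1), "attached": True},    # follows the shared object
         {"pos": (5, 5), "attached": False}]   # keeps its own position
run_action(shared, notes, 2, 0)
```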
[0047] In an embodiment of the present invention, when the user devices 15, 17, 19 log in to the server host 11 through the network and the server host 11 identifies and allocates connections between the user devices 15, 17, and 19, the server host 11 can assign the user devices 15, 17, 19 different permissions, such as management permission or viewing permission. A user device with viewing permission can generally only view the shared objects and objectified inputs displayed on its interactive program, and cannot perform input or operations. A user device with management permission can not only view but also input and operate. In addition, a user device with management permission can grant a user device with viewing permission the authority to input or operate, and a user device with viewing permission can input or operate only after obtaining such authorization.
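The permission model could be sketched like this (device names and role labels are illustrative): viewing-only devices may input solely after a device with management permission grants authorization.

```python
# Hypothetical sketch: management vs. viewing permissions, with the
# management device able to authorize a viewing device to input.
perms = {"device15": "manage", "device17": "view", "device19": "view"}
granted = set()   # viewing devices that have been authorized to input

def authorize(granter, grantee):
    """Only a device with management permission may grant input authority."""
    if perms.get(granter) == "manage":
        granted.add(grantee)

def may_input(device):
    """A device may input if it manages, or if it was authorized."""
    return perms.get(device) == "manage" or device in granted

authorize("device15", "device17")   # device17 may now input
authorize("device17", "device19")   # ignored: device17 cannot grant
```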
[0048] In an embodiment of the present invention, the connection status analysis module 113 of the server host 11 analyzes the connection status of each user device 15, 17, 19. When the connection statuses of all the user devices 15, 17, 19 are in a low load state, the server host 11 controls the user devices 15, 17, 19 to execute in a first mode. The first mode is a mode in which the shared surface is established by a user device and the objectified input is transmitted directly to the other user devices. One of the user devices 15, 17, 19 establishes a shared surface, creates a shared object on it, and transmits the shared object to one or more designated user devices. When one of the user devices 15, 17, and 19 makes an input on its interactive program, the objectified input is transmitted directly to the designated user devices. The virtual surfaces of the user devices 15, 17, and 19 display the received shared objects and objectified inputs in their interactive programs in a superimposed manner.
[0049] When the connection status analysis module 113 analyzes the connection status of each user device 15, 17, 19 and finds that the connection status of one of the user devices is in a high load state, the server host 11 controls the user devices 15, 17, 19 to execute in a second mode. The second mode is a mode in which the shared surface is established by the server host 11 and the objectified input is transmitted to the other user devices through the server host 11. The server host 11 creates a shared surface, creates a shared object on it, and sends the shared object to one or more designated user devices 15, 17, 19. When one of the user devices 15, 17, 19 makes an input on its interactive program, the objectified input is first sent to the server host 11, and the server host 11 then transmits it to one or more designated user devices 15, 17, 19. The virtual surfaces of the user devices 15, 17, and 19 then display the received shared objects and objectified inputs in their interactive programs in a superimposed manner.
[0050] The connection status analysis module 113 may analyze at least one of the connection bandwidth and the number of connections of each user device 15, 17, 19 to determine the load level of the connection status, but is not limited thereto. Through the connection status analysis module 113, the remote interactive system 20 of the present invention can operate in different modes according to the connection status of the user devices, so that the connection quality of the system is smoother and the bandwidth load of the server host 11 is reduced.
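The mode selection described in the last three paragraphs can be sketched as follows. The bandwidth and connection-count thresholds here are purely illustrative assumptions; the patent only says the module analyzes at least one of these signals.

```python
# Hypothetical sketch: classify each device's connection load, then pick
# the first mode (direct P2P) only when every device is lightly loaded.
def load_state(bandwidth_mbps, connections):
    """Illustrative thresholds only; the patent does not fix any values."""
    return "low" if bandwidth_mbps >= 10 and connections <= 20 else "high"

def choose_mode(device_states):
    # First mode: user device hosts the shared surface, P2P delivery.
    # Second mode: server host hosts the shared surface and relays input.
    if all(state == "low" for state in device_states):
        return "first"
    return "second"

# One congested device is enough to fall back to server relay.
mode = choose_mode([load_state(50, 3), load_state(2, 3)])
```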
[0051] In an embodiment of the present invention, the user can control the user devices 15, 17, 19 through the server host 11 to execute in the first mode or the second mode.
[0052] See Figure 5, which is a schematic diagram of the interaction process of each terminal in an embodiment of the remote interaction system of the present invention; refer also to Figure 2. This flowchart shows the procedure in which the server host 11 establishes the shared surface, the user devices 15 and 17 make inputs, and the objectified input is transmitted to the other user device through the server host 11.
[0053] As shown in the figure, the user devices 15 and 17 respectively log in (501, 502) to the server host 11 through the network, and the server host 11 identifies them (503, 504) and allocates the connection between the user devices 15 and 17 (505).
[0054] The shared layer module 115 of the server host 11 establishes a shared surface, which is a virtual layer on which a shared object can be created, and transmits the shared object (506, 507) to the user devices 15 and 17 respectively. When the user operates the user device 15 to make an input on its interactive program, the user device 15 can capture the input through its virtual surface and use the objectification module 153 to objectify the input. The user device 15 can first transmit the objectified input (508) to the server host 11 through the network connection, and the server host 11 then transmits the objectified input (509) to the user device 17. When another user operates the user device 17 to make an input on its interactive program, the user device 17 can first transmit the objectified input (511) to the server host 11 through the network connection, and the server host 11 then transmits the objectified input (510) to the user device 15.
[0055] See Figure 6, which is a schematic diagram of the interaction process of each terminal in another embodiment of the remote interaction system of the present invention; refer also to Figure 2. This flowchart shows the procedure in which the shared surface is established by the user device 15, the user devices 15 and 17 make inputs, and the objectified input is transmitted directly to the other user device.
[0056] As shown in the figure, the user devices 15 and 17 respectively log in (601, 602) to the server host 11 through the network, and the server host 11 identifies them (603, 604) and allocates the connection between the user devices 15 and 17 (605).
[0057] The layer module 151 of the user device 15 establishes a shared surface, which is a virtual layer on which a shared object can be created, and this shared object (606) is directly transmitted to the user device 17. When the user operates the user device 15 to make an input on its interactive program, the user device 15 can capture the input through its virtual surface and use the objectification module 153 to objectify the input. The user device 15 can directly transmit the objectified input (607) to the user device 17 via a network connection. When another user operates the user device 17 to make an input on its interactive program, the user device 17 can directly transmit the objectified input (608) to the user device 15 via a network connection.
[0058] See Figure 7, which is a schematic diagram of another embodiment of the remote interactive system of the present invention. As shown in the figure, the remote interactive system 70 includes a server host 11 and user devices 15, 17, and 19.
[0059] The difference from the embodiment illustrated in Figure 2 is that, in this embodiment, the user devices 15, 17, and 19 belong to the same or different groups, and the shared objects and objectified inputs are only sent to one or more designated user devices belonging to the same group. As shown in the figure, the user devices 15 and 17 belong to a first group 71, and the user devices 15 and 19 belong to a second group 73.
[0060] A shared object or objectified input created by the user device 17 will only be transmitted to the other user device 15 belonging to the same first group 71, and not to the user device 19 belonging to the second group 73. A shared object or objectified input created by the user device 19 will only be transmitted to the other user device 15 belonging to the second group 73, and not to the user device 17 belonging to the first group 71. A shared object or objectified input created by the user device 15 will be sent to the other user device 17 that also belongs to the first group 71 or the other user device 19 that also belongs to the second group 73.
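The group routing rule above can be sketched as follows (group and device names mirror the figure; the lookup structure is an assumption): an objectified input is delivered only to the other members of the group it was created in.

```python
# Hypothetical sketch: route an objectified input only to other members
# of the sender's group, matching the two-group example in Figure 7.
groups = {"group71": {"device15", "device17"},
          "group73": {"device15", "device19"}}

def recipients(sender, group):
    """Every member of the group except the sender, in stable order."""
    return sorted(groups[group] - {sender})

# device17's input stays inside group 71 and never reaches device19.
r = recipients("device17", "group71")
```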
[0061] Through the group setting, different user devices can belong to the same or different groups to conduct remote interactive discussions on different objects, contents, or topics. For example, the user devices 15 and 17 may form a discussion group between colleagues, while the user devices 15 and 19 form a discussion group with customers; or the user devices 15 and 17 may form a discussion group on movie topics, while the user devices 15 and 19 form a discussion group on decoration design, thereby achieving the purpose of multiple concurrent discussions.
[0062] In an embodiment of the present invention, the user devices 15, 17, 19 belonging to the same group may have different permissions, such as moderator permission or viewing permission. A user device with viewing permission can generally only view the shared objects and objectified inputs of its group displayed on its interactive program, and cannot perform input or operations. A user device with moderator permission can not only view the shared objects and objectified inputs of the group, but also input and operate. In addition, a user device with moderator permission can grant a user device with viewing permission in the same group the authority to input or operate, and a user device with viewing permission can input or operate only after obtaining such authorization.
[0063] See Figure 8, which is a flowchart of an embodiment of the remote interaction method of the present invention; refer also to Figure 2. As shown in step S801, the user devices 15, 17, and 19 log in to the server host 11 respectively. Next, in step S803, the server host 11 identifies and allocates connections between the user devices 15, 17, and 19. In step S805, the server host 11 or one of the user devices 15, 17, 19 creates a shared surface; the shared surface is a virtual layer on which a shared object can be created. The shared object is transmitted to one or more designated user devices (step S807). Then, in step S809, the user devices 15, 17, and 19 respectively execute an interactive program to create a virtual surface. When one of the user devices 15, 17, and 19 makes an input on its interactive program, the input is captured and objectified through its virtual surface (step S811), and the objectified input is transmitted to one or more designated user devices 15, 17, 19 (step S813). Next, in step S815, the virtual surfaces of the user devices 15, 17, and 19 display the shared object and the received objectified input in their interactive programs in a superimposed manner.
[0064] See Figure 9, which is a flowchart of another embodiment of the remote interaction method of the present invention; refer also to Figure 2. As shown in step S901, the user devices 15, 17, and 19 log in to the server host 11 respectively. Next, in step S902, the server host 11 identifies and allocates connections between the user devices 15, 17, and 19. In step S903, the connection status analysis module 113 of the server host 11 analyzes the connection status of each user device 15, 17, 19. If the connection statuses of the user devices 15, 17, and 19 are all in a low load state, step S904 is performed; if not, the method proceeds to step S911. The connection status analysis module 113 can analyze at least one of the connection bandwidth and the number of connections of each user device 15, 17, 19 to determine the load level of the connection status, but is not limited thereto.
[0065] In step S904, the server host 11 controls the user devices to execute in a first mode. The first mode is a mode in which the shared surface is established by a user device and the objectified input is transmitted directly to the other user devices. In step S905, one of the user devices 15, 17, and 19 establishes a shared surface; the shared surface is a virtual layer on which a shared object can be created. The shared object is transmitted to one or more designated user devices (step S906). When one of the user devices 15, 17, 19 makes an input on its interactive program, the objectified input is transmitted directly to one or more designated user devices 15, 17, 19 (step S907). Next, in step S908, the virtual surfaces of the user devices 15, 17, and 19 display the received shared object and objectified input in their interactive programs in a superimposed manner. Then, step S903 is repeated.
[0066] In step S911, the server host 11 controls the user devices 15, 17, 19 to execute in a second mode. The second mode is a mode in which the shared surface is established by the server host 11 and the objectified input is transmitted to the other user devices through the server host 11. In step S912, the server host 11 establishes a shared surface; the shared surface is a virtual layer on which a shared object can be created. The shared object is transmitted to one or more designated user devices (step S913). When one of the user devices 15, 17, 19 makes an input on its interactive program, the objectified input is first sent to the server host 11, and the server host 11 then sends it to one or more designated user devices 15, 17, 19 (step S914). Then, step S908 is performed again, and the virtual surfaces of the user devices 15, 17, and 19 display the received shared object and objectified input in their interactive programs in a superimposed manner. Then, step S903 is repeated.
[0067] Through the connection status analysis module 113, the remote interactive system 20 of the present invention can operate in different modes according to the connection status of the user devices, so that the connection quality of the system is smoother and the bandwidth load of the server host 11 is reduced.
[0068] See Figure 10, which is a flowchart of another embodiment of the remote interaction method of the present invention; refer also to Figure 7. As shown in step S101, the user devices 15, 17, and 19 respectively log in to the server host 11, wherein the user devices 15, 17, and 19 belong to the same or different groups; for example, the user devices 15 and 17 belong to a first group 71, and the user devices 15 and 19 belong to a second group 73. Next, in step S102, the server host 11 identifies and allocates connections between the user devices 15, 17, and 19. In step S103, the server host 11 or one of the user devices 15, 17, 19 establishes a shared surface; the shared surface is a virtual layer on which a shared object can be created. The shared object is transmitted to one or more designated user devices belonging to the same group (step S104). Then, in step S105, the user devices 15, 17, and 19 respectively execute an interactive program to create a virtual surface. When one of the user devices 15, 17, and 19 makes an input on its interactive program, the input is captured and objectified through its virtual surface (step S106), and the objectified input is transmitted to one or more designated user devices belonging to the same group (step S107). Next, in step S108, the virtual surfaces of the user devices 15, 17, and 19 display the received shared object and objectified input in their interactive programs in a superimposed manner.
[0069] Therefore, through the technology contained in the disclosure, the remote interaction method and system proposed by the present invention can achieve the purpose of reducing bandwidth load during remote real-time interaction, and can provide more diverse and interesting interactive content.
[0070] The foregoing descriptions are only preferred and feasible embodiments of the present invention, and all equivalent changes and modifications made in accordance with the scope of the patent application of the present invention shall fall within the scope of the present invention.


