A virtual assembly method based on augmented reality and mobile interaction

A mobile interaction and virtual assembly technology, applied in the field of virtual assembly, which solves the problems of part damage, limited observation range, and inconvenient observation, achieving the effects of an expanded operating space and a wider field of view.

Active Publication Date: 2019-03-26
SOUTH CHINA UNIV OF TECH
Cites: 2 · Cited by: 5


Abstract

The invention provides a virtual assembly method based on augmented reality and mobile interaction. The method uses Leap Motion to acquire hand gesture data and projects it into the virtual environment to interact with the virtual model; Interval Kalman Filter (IKF) and Particle Filter (PF) algorithms estimate the position and orientation of the hand. The position of the human body is located from the joint data acquired by Kinect, and the virtual 3D assembly robot and the assembly part models are projected in front of the operator by AR technology. Magnetic force feedback uses electromagnetic force to let the operator feel the interaction force between parts during assembly. By using AR technology, the operator can view a part from all angles, 360 degrees without blind spots, improving immersion in the virtual assembly process; electromagnetic force feedback lets the operator feel the interaction between parts more clearly, so the fit between parts can be judged. The experience is closer to real assembly, and the operation is more natural and convenient.

Example Embodiment

[0025] The specific implementation of the present invention is further described below with reference to the accompanying drawings and examples, but the implementation and protection of the present invention are not limited thereto. It should be pointed out that where a process is not specifically described below (such as the interval Kalman filtering and particle filtering algorithms), those skilled in the art can understand or implement it with reference to the prior art.
[0026] Some virtual assembly methods currently exist, but they still have defects worth improving. Part of current virtual assembly technology is realized with computer simulation. Virtual assembly realized this way is inconvenient for the operator when observing the assembled parts: the observation angle is fixed and the observation range is limited, so problems may go unnoticed during assembly. The other part is realized with virtual reality, but the user's sense of immersion during virtual assembly in virtual reality is insufficient. The two technologies also share a further problem: during virtual assembly the user cannot feel the interaction force between the assembly parts, so issues the user cannot perceive may arise, and the actual assembly would cause damage to the parts.
[0027] To solve the above problems in the prior art, this embodiment provides a virtual assembly method based on augmented reality and mobile interaction. Leap Motion acquires human gesture data; AR technology then builds 3D models of the corresponding assembly robot and parts and projects them into the real world, allowing the operator to control the robot and assemble parts by hand; electromagnets feed back the interaction force between parts during assembly, so the operator can feel it and adjust in time. The method mainly includes the following parts:
[0028] S1. Acquire the position of the human body and track it in real time: Kinect identifies the human joints to position and track the operator, so that the operating platform moves with the operator's hand throughout the operation.
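As an illustration of step S1, the following sketch shows one way the platform could follow the operator, assuming a tracked torso position and facing direction derived from the Kinect skeleton. The follow distance, step size, and function names are illustrative assumptions, not values specified by the patent.

```python
import math

FOLLOW_DISTANCE = 0.6  # metres in front of the operator (assumed value)

def platform_target(torso_xy, facing_angle_rad):
    """Return the (x, y) the platform should move to: a point
    FOLLOW_DISTANCE ahead of the operator along the facing direction."""
    x, y = torso_xy
    return (x + FOLLOW_DISTANCE * math.cos(facing_angle_rad),
            y + FOLLOW_DISTANCE * math.sin(facing_angle_rad))

def follow_step(platform_xy, torso_xy, facing_angle_rad, max_step=0.05):
    """Move the platform at most max_step metres toward its target,
    so it smoothly tracks the operator as described in step S1."""
    tx, ty = platform_target(torso_xy, facing_angle_rad)
    px, py = platform_xy
    dx, dy = tx - px, ty - py
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return (tx, ty)
    # Step toward the target along the unit direction vector.
    return (px + max_step * dx / dist, py + max_step * dy / dist)
```

Called once per tracking frame, this keeps the platform converging on a point in front of the operator without abrupt jumps.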
[0029] S2. Obtain gesture data, and use Leap Motion to obtain the operator's gesture data.
[0030] S3. Remove data noise and estimate the position and orientation of the hand: Interval Kalman Filter (IKF) and Particle Filter (PF) algorithms denoise the data obtained by Leap Motion, making the gesture data more accurate while estimating the position and orientation of the hand.
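The IKF/PF pipeline itself is not detailed here. As a stand-in, a minimal scalar Kalman filter illustrates the predict/update smoothing idea on noisy 1-D position samples; the process and measurement noise variances are arbitrary assumptions, and a real implementation would use the interval and particle filters named above.

```python
def kalman_smooth(samples, q=1e-3, r=0.1):
    """Smooth a list of noisy 1-D position samples with a
    constant-position scalar Kalman filter.
    q: process noise variance, r: measurement noise variance (assumed)."""
    x, p = samples[0], 1.0      # initial state estimate and covariance
    out = [x]
    for z in samples[1:]:
        p = p + q               # predict: covariance grows by process noise
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)     # update estimate toward measurement z
        p = (1.0 - k) * p       # shrink covariance after the update
        out.append(x)
    return out
```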
[0031] S4. Perform collision detection so that hands in the real environment can act directly on parts in the virtual environment, letting the operator interact with the virtual parts bare-handed and realizing direct bare-hand assembly.
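The collision test between the hand model and a virtual part can be sketched with bounding spheres; this is a simplification for illustration (a real system would test finer-grained geometry such as per-joint colliders or meshes).

```python
import math

def spheres_collide(center_a, radius_a, center_b, radius_b):
    """True if two bounding spheres overlap, e.g. a fingertip sphere of the
    tracked hand model touching a virtual part's bounding sphere."""
    return math.dist(center_a, center_b) <= radius_a + radius_b
```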
[0032] S5. Use electromagnets for force feedback, use the electromagnets on the operating platform and the magnet array on the human hand to control the current through closed-loop control to achieve force feedback that simulates real assembly conditions.
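The closed-loop current control in step S5 is not specified in detail; a proportional-integral loop is one plausible form. In this hedged sketch, the controller adjusts coil current so a measured electromagnetic force tracks the target force computed by the simulation. The gains, time step, and the existence of a force measurement are all assumptions.

```python
class CurrentController:
    """PI loop driving coil current so measured force tracks target force
    (gains and dt are illustrative, not from the patent)."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def step(self, target_force, measured_force, dt=0.01):
        """Return the current adjustment for one control cycle."""
        error = target_force - measured_force
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral
```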
[0033] The virtual assembly system realizing this example is divided into five modules: gesture capture, jitter elimination, virtual mapping, collision detection, and magnetic force feedback. The gesture capture module uses two Leap Motions to obtain hand gesture data; by capturing the finger joints and palm, a 3D model of the hand skeleton is established, which is convenient to project into the virtual environment to interact with the virtual model. The jitter elimination module uses the Interval Kalman Filter (IKF) and Particle Filter (PF) to estimate the position and orientation of the hand, eliminating the noise generated during sensor acquisition.
[0034] The virtual mapping module locates the person from the human joint data obtained by Kinect, and then uses AR technology to project the virtual assembly part models onto the operating platform in front of the person. The operating platform dynamically tracks the operator, and its coil is continuously adjusted throughout the virtual assembly process so that the center of the magnet array on the human hand stays aligned with the central axis of the coil, ensuring the operator feels accurate force feedback at all times. The operating space of the operator's hand is thereby greatly expanded: the operator no longer needs to worry about operation failure caused by moving the hand out of the operating space, which improves immersion in the virtual assembly process. The mobility of the robot makes mobile interaction possible throughout the method; the operator can move at will, and whenever the operator moves, the robot automatically plans a path to re-project the virtual objects in front of the operator.
[0035] The collision detection module first performs collision detection between the model built from the human hand and the part the operator intends to operate, to determine whether there is an interaction, realizing interaction between the real human hand and the virtual part; a detected collision means the operator is performing an action such as grabbing the part. The module is also responsible for detecting collisions between two parts during assembly. An artificial potential field method is used here to provide assistance: the entire operating space is assumed to be a potential field, a composite field formed by superposing two fields, a gravitational (attractive) potential field and a repulsive potential field. The target state and position the part should reach is a gravitational point; a state that would damage the part or make assembly impossible is a repulsive point. The gravitational field can be expressed as:
[0036] U_att(q) = (1/2) · ξ · ρ²(q, q_tar)

[0037] where q is the state of the object, q_tar is the target state to be reached (the set gravitational point), ξ is the gravitational factor, and ρ(·, ·) is the function giving the distance between two states. The gravitational force is the (negative) derivative of the gravitational field function:

[0038] F_att(q) = −∇U_att(q) = ξ · (q_tar − q)  (for Euclidean ρ)

[0039] The repulsion field generated by the repulsion point can be expressed as:

[0040] U_rep(q) = (1/2) · η · (1/ρ(q, q_unexp) − 1/ρ₀)²  if ρ(q, q_unexp) ≤ ρ₀; U_rep(q) = 0 otherwise

[0041] where η is the corresponding repulsion factor, q_unexp is the state that would damage the parts (the set repulsion point), and ρ₀ is the radius of the range the repulsion point can influence: when the part is farther than ρ₀ from the repulsion point, the influence of the repulsion point is zero. Similarly, the repulsion force exerted on the object is the (negative) derivative of the repulsion field function:

[0042] F_rep(q) = −∇U_rep(q) = η · (1/ρ(q, q_unexp) − 1/ρ₀) · (1/ρ²(q, q_unexp)) · ∇ρ(q, q_unexp)  if ρ(q, q_unexp) ≤ ρ₀; F_rep(q) = 0 otherwise
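The attractive and repulsive forces of the artificial potential field described above can be sketched in a few lines, assuming Euclidean distance between 3-D states; the values of ξ (xi), η (eta), and ρ₀ (rho0) here are illustrative.

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(sum(x * x for x in a))

def attractive_force(q, q_tar, xi=1.0):
    """F_att = xi * (q_tar - q): the negative gradient of
    U_att = 0.5 * xi * rho^2 for Euclidean rho."""
    return tuple(xi * d for d in sub(q_tar, q))

def repulsive_force(q, q_unexp, eta=1.0, rho0=0.5):
    """Zero beyond the influence radius rho0; otherwise pushes away from
    the repulsion point with magnitude eta*(1/rho - 1/rho0)/rho^2."""
    diff = sub(q, q_unexp)
    rho = norm(diff)
    if rho > rho0 or rho == 0.0:
        return (0.0, 0.0, 0.0)
    mag = eta * (1.0 / rho - 1.0 / rho0) / (rho * rho)
    # Scale the unit vector pointing away from the repulsion point.
    return tuple(mag * d / rho for d in diff)
```

Summing the two forces over all gravitational and repulsive points yields the guidance force acting on a part in the composite field.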
[0043] Magnetic force feedback uses electromagnetic force to let the operator feel the interaction force between parts during assembly. AR technology allows the operator to observe parts from all angles, 360 degrees without blind spots, which improves immersion in the virtual assembly process; electromagnetic force feedback lets the operator feel the interaction between parts more clearly, so the fit between parts can be judged, the experience is closer to real assembly, and the operation is more natural and convenient. In the experiments, a magnet array was used instead of a monolithic magnet: a monolithic magnet would hinder the movement of the operator's hand, while a magnet array does not impede hand movement and does not degrade the force feedback.
[0044] The workflow, with reference to the drawings, is as follows. First, Kinect tracks the human body so that the operating platform moves with it. Leap Motion then acquires the operator's gesture data, and the Interval Kalman Filter (IKF) and Particle Filter (PF) algorithms denoise the Leap Motion data to estimate the position and orientation of the hand. With the gesture data acquired, AR technology builds 3D models of the hand bones and parts and projects the resulting virtual models into the real world; using collision detection, the operator can control and assemble the parts in the virtual environment with bare hands. During assembly, collision detection again judges the interaction between parts, and the electromagnet feeds back the interaction force so the operator can feel it and adjust in time.

