
98 results for "3D interaction" patented technology

In computing, 3D interaction is a form of human-machine interaction in which users move and interact in 3D space. Both the human and the machine process information in which the physical position of elements in 3D space is relevant.

Unmanned aerial vehicle three-dimensional display control comprehensive-training system

The invention discloses an unmanned aerial vehicle three-dimensional display and control comprehensive training system, and relates to the technical field of training simulation. The system comprises a flight simulation module, a visual simulation module, a flight control interaction module, a maintenance operation training module and an autonomous learning and examination module. It adopts a driving-type three-dimensional curved-surface display to build the display platform, uses 3D interaction technology and three-dimensional driving technology to drive and display three-dimensional images, achieves highly realistic man-machine interaction through interaction hardware, establishes a multi-sensory simulated ground control environment, and provides three working modes: flight control training, maintenance operation training, and autonomous learning and examination. With the system, an unmanned aerial vehicle operator can master the control method, operation process and special-situation handling methods of the aerial vehicle and accurately perceive the state of the aerial vehicle and the situation of its three-dimensional environment, while a maintainer can quickly become familiar with the principle, structure and functions of the aerial vehicle and master the support process and maintenance methods.
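The abstract above names five subsystems and three working modes; the following is a minimal sketch of how such a system might wire modes to subsystems (all class, module and mode names are assumptions for illustration, not taken from the patent):

```python
from enum import Enum, auto

class WorkingMode(Enum):
    """The three working modes named in the abstract."""
    FLIGHT_CONTROL_TRAINING = auto()
    MAINTENANCE_OPERATION_TRAINING = auto()
    AUTONOMOUS_LEARNING_EXAMINATION = auto()

class TrainingSystem:
    """Ties the five subsystems named in the abstract to the working modes (illustrative only)."""

    def __init__(self, flight_sim, visual_sim, flight_control_io, maintenance, examination):
        self.modules = {
            "flight_simulation": flight_sim,
            "visual_simulation": visual_sim,
            "flight_control_interaction": flight_control_io,
            "maintenance_operation_training": maintenance,
            "autonomous_learning_examination": examination,
        }

    def active_modules(self, mode: WorkingMode):
        # Which subsystems each mode draws on is an assumption for illustration.
        selection = {
            WorkingMode.FLIGHT_CONTROL_TRAINING:
                ["flight_simulation", "visual_simulation", "flight_control_interaction"],
            WorkingMode.MAINTENANCE_OPERATION_TRAINING:
                ["visual_simulation", "maintenance_operation_training"],
            WorkingMode.AUTONOMOUS_LEARNING_EXAMINATION:
                ["autonomous_learning_examination"],
        }
        return [self.modules[name] for name in selection[mode]]
```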
Owner:JINLIN MEDICAL COLLEGE

Interactive product configuration platform

An interactive product configuration platform comprises a two-dimensional (2D) or three-dimensional (3D) information interface module, a product case base module, a parts base module, a product and parts management module, an optimal solver module, an embedded 3D product holographic information interaction module and a parameter-driven modeling module. The 2D or 3D information interface module is associated with a generic bill of materials (GBOM) and is used for obtaining user demands and generating a bill of materials (BOM). The product case base module and the parts base module have online self-learning capability. The product and parts management module is based on cloud computing. The optimal solver module is associated with the product structures and the case base and adopts a quadratic parabola fuzzy set case-based reasoning method to match and solve for the case in the case base most similar to the customized product. The embedded 3D product holographic information interaction module is associated with the case base and allows the user to preview holographic cases of optimal products and modify them interactively. The parameter-driven modeling module is associated with the optimal cases and establishes a 3D model of the customized product, driven automatically by parametric characteristics, according to the user demands and interaction information. The platform achieves 3D interaction and improves product development and design efficiency.
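The "quadratic parabola fuzzy set case-based reasoning" used by the optimal solver module is not spelled out in the abstract; the sketch below shows one plausible reading, in which each parameter's match is scored with a parabolic fuzzy membership function and cases are ranked by a weighted aggregate (the membership form, weighting scheme and all names are assumptions):

```python
def parabolic_membership(value, target, tolerance):
    """Quadratic (parabolic) fuzzy membership: 1.0 at the target, falling to 0.0 at +/- tolerance."""
    d = abs(value - target) / tolerance
    return max(0.0, 1.0 - d * d)

def case_similarity(case_params, demand_params, tolerances, weights):
    """Weighted aggregate of per-parameter memberships (illustrative weighting only)."""
    score = sum(weights[k] * parabolic_membership(case_params[k], demand_params[k], tolerances[k])
                for k in demand_params)
    return score / sum(weights.values())

def retrieve_optimal_case(case_base, demand_params, tolerances, weights):
    """Return the case in the case base most similar to the customized-product demand."""
    return max(case_base, key=lambda c: case_similarity(c["params"], demand_params, tolerances, weights))

# Example: match a demanded gear ratio and rated power against two stored cases.
cases = [{"id": "A", "params": {"ratio": 3.5, "power_kw": 10}},
         {"id": "B", "params": {"ratio": 4.0, "power_kw": 12}}]
best = retrieve_optimal_case(cases, {"ratio": 3.8, "power_kw": 11},
                             tolerances={"ratio": 1.0, "power_kw": 5.0},
                             weights={"ratio": 0.6, "power_kw": 0.4})
print(best["id"])  # prints "B", the closer of the two stored cases
```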
Owner:ZHEJIANG UNIV OF TECH

3D interaction-type projection lens

The invention provides a projection lens. The projection lens comprises, in sequence from the imaging side to the image source side, a first lens set, a second lens set, a third lens set and a fourth lens set. The first lens set comprises a first lens with negative focal power and a reflective optical surface that bends the light path. The second lens set comprises a second lens with positive focal power whose imaging-side and image-source-side faces are both convex. The third lens set comprises a third lens with negative focal power. The fourth lens set has positive focal power, comprises one or more lenses with focal power, and has a convex face closest to the imaging side. The lens satisfies the relation ImgH/D > 0.55, wherein ImgH is one half of the diameter of the image source and D is the vertical distance between the imaging-side face of the first lens and the center of the reflective optical surface. Because four lens sets are adopted, the size of the lens system can be effectively reduced, high resolution can be guaranteed under a large view angle, and the technical effects of large field-of-view angle, small distortion and high resolution are achieved.
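The design constraint quoted in the abstract can be written compactly as below; the numerical example uses an assumed image-source half-diameter of 3 mm purely for illustration.

```latex
\frac{\mathrm{ImgH}}{D} > 0.55
\quad\Longleftrightarrow\quad
D < \frac{\mathrm{ImgH}}{0.55},
\qquad \text{e.g. } \mathrm{ImgH} = 3\,\mathrm{mm} \;\Rightarrow\; D < 5.45\,\mathrm{mm}
```

Here ImgH is one half of the image-source diameter and D is the vertical distance from the imaging-side face of the first lens to the center of the reflective optical surface, as defined in the abstract.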
Owner:ZHEJIANG SUNNY OPTICAL CO LTD

Virtual reality system for experiencing fire-fighting scene

The invention discloses a virtual reality system for experiencing a fire-fighting scene. The virtual reality system comprises a fire-fighting virtual training interaction module, a fire-fighting training scene virtual electric control module and a practice training cabin. The fire-fighting virtual training interaction module utilizes a 3D interaction technique to establish a highly compatible, high-precision entity model. The fire-fighting training scene virtual electric control module combines an environment control module with a digital-to-analogue conversion device to seamlessly link the reactions of trainees in the virtual fire scene with the virtual equipment and scene objects in that scene. A special annular-view-angle high-reflection display carrier with a diameter of 5 m forms the outer layer of the practice training cabin, and a 360-degree multi-channel curved-surface fusion projection system composed of five sets of high-luminance image projection devices is arranged above the carrier. The system consumes no real fire-fighting equipment during drills, does not pollute the environment, and supports the daily fire-fighting drills and rescue-skill training of fire-fighting commanders.
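For the 360-degree, five-channel fusion projection mentioned above, each channel must cover 360/5 = 72 degrees plus some extra width for edge blending between neighbouring projectors. A minimal layout sketch follows; the 10-degree blend overlap and the function name are assumed values for illustration only.

```python
def channel_layout(num_channels=5, blend_overlap_deg=10.0):
    """Yaw centre and horizontal FOV for each projection channel on a 360-degree ring.

    The five-projector count comes from the abstract; the blend overlap is an
    assumed value used only to illustrate the soft-edge fusion between channels.
    """
    step = 360.0 / num_channels  # 72 degrees per channel for five projectors
    return [
        {"channel": i,
         "yaw_center_deg": i * step,
         "h_fov_deg": step + blend_overlap_deg}  # extra width feeds the edge blend
        for i in range(num_channels)
    ]

for ch in channel_layout():
    print(ch)
```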
Owner:SHANGHAI FIRE RES INST OF THE MIN OF PUBLIC SECURITY +1

Medical image workstation with stereo visual display

The invention discloses a medical image workstation with stereo visual display. The medical image workstation comprises a computer mainframe, a three-dimensional (3D) display screen, an input device and a software system. The software system comprises a software function module, a stereo visual display module, a 3D interaction module, and a 2D display and stereo visual display linkage module. The software function module provides the functions of an ordinary medical image workstation. The stereo visual display module achieves stereo visual display with depth information. The 3D interaction module receives operating commands with depth information that the user enters through the input device and displays the operating results synchronously and in real time on the 3D display screen. The 2D display and stereo visual display linkage module links a 2D display screen with the 3D display screen, or switches between 2D display and 3D stereo visual display. The workstation overcomes the shortcomings of existing medical image workstations in stereo visual display, 3D interaction, and the linkage between 2D display and stereo visual display, so that doctors' observation effects and work efficiency can be improved.
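One way the 2D/3D linkage module could be realised is by mapping the cursor on a 2D slice to a 3D world position so the stereo view can show a synchronised crosshair. The sketch below assumes a simple axis-aligned volume; the axis conventions, spacing values and function names are assumptions, not taken from the patent.

```python
import numpy as np

def slice_cursor_to_world(slice_index, row, col, spacing, origin):
    """Map a 2D cursor (axial slice index, pixel row/col) to a 3D world point.

    spacing = (sx, sy, sz) voxel spacing in mm, origin = volume origin in mm.
    A real workstation would read axis orientation from the image header
    (e.g. DICOM orientation tags) instead of assuming axis alignment.
    """
    sx, sy, sz = spacing
    ox, oy, oz = origin
    return np.array([ox + col * sx, oy + row * sy, oz + slice_index * sz])

# Example: cursor on slice 40, pixel (256, 128) of a 0.7 x 0.7 x 1.0 mm volume.
point = slice_cursor_to_world(40, 256, 128, (0.7, 0.7, 1.0), (0.0, 0.0, 0.0))
print(point)  # position at which the linked 3D view would draw its crosshair
```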
Owner:湖北得康科技有限公司

Method and device of rendering 3D (three dimensional) model in any orientation

Status: Inactive | Publication number: CN103810746A | Effects: better control state of the 3D model; improves the speed of determining the orientation of 3D models in real time | Classifications: 3D-image rendering; image processing; computer graphics (images)
The invention is applicable to the field of image processing and provides a method and a device for rendering a 3D (three-dimensional) model in any orientation. The method comprises the steps of: obtaining, through a 3D interactive tool, the initial position and initial direction vector of the 3D model at the previous moment, together with its target position and target direction vector; calculating the rotation matrix of the 3D model from the initial and target direction vectors, and calculating the translation matrix of the 3D model from the initial and target positions; calculating the control matrix of the 3D model from the rotation matrix and the translation matrix; and applying the control matrix to the original 3D data points of the model to obtain the target coordinates and target directions of all points of the 3D model, and rendering the target 3D model. Any 3D model that meets the control requirement can be rendered, so the method is applicable to real-time rendering of any 3D model; the speed of determining the orientation of the 3D model in real time is improved, and control over the 3D model becomes more effective.
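A minimal NumPy sketch of the steps described above: build a rotation matrix from the initial and target direction vectors, derive the translation from the initial and target positions, compose them into a 4x4 control matrix, and apply it to the model's points. The Rodrigues construction and the exact way the matrices are composed are assumptions, not the patent's definitive implementation.

```python
import numpy as np

def rotation_between(v_from, v_to):
    """Rotation matrix taking unit vector v_from onto unit vector v_to (Rodrigues' formula)."""
    a = v_from / np.linalg.norm(v_from)
    b = v_to / np.linalg.norm(v_to)
    v = np.cross(a, b)
    c = np.dot(a, b)
    if np.isclose(c, -1.0):  # opposite directions: rotate 180 deg about any perpendicular axis
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def control_matrix(p0, d0, p1, d1):
    """4x4 control matrix combining the rotation (d0 -> d1) with the translation (p0 -> p1)."""
    R = rotation_between(np.asarray(d0, float), np.asarray(d1, float))
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = np.asarray(p1, float) - R @ np.asarray(p0, float)
    return M

def transform_points(points, M):
    """Apply the control matrix to an (N, 3) array of model points."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    return (pts @ M.T)[:, :3]

# Example: model initially at the origin facing +Z, target at (1, 2, 0) facing +X.
M = control_matrix(p0=[0, 0, 0], d0=[0, 0, 1], p1=[1, 2, 0], d1=[1, 0, 0])
print(transform_points(np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0]]), M))
```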
Owner:TCL CORPORATION

Interactive editing and stitching method and device for garment cut pieces

The invention relates to the field of garment design and discloses a 3D interactive editing and stitching method for garment CAD cut pieces, aiming to solve the problem that, in garment design, the garment must be repeatedly modified and redesigned according to the simulation effect of the designed garment on a human body model. The method comprises the steps of: loading a DXF file from a garment CAD design application together with a three-dimensional human body model; extracting feature data of the garment cut pieces from the DXF file, wherein the feature data includes information on the stitching edges that form the garment cut pieces; using an interactive tool to place the garment cut pieces around the corresponding positions of the three-dimensional human body model and, according to the stitching edge information, determining the stitching edge pairs between the garment cut pieces, wherein the interactive operation is a movement operation of the garment cut pieces carried out with the interactive tool; and, using the stitching relationship between the stitching edge pairs of each garment cut piece, stitching the cut pieces to generate a stitched three-dimensional garment model. Garment design and simulation are executed synchronously and the designed garment is simulated in real time, so that garment design efficiency is improved.
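A sketch of the data structures such a method might work with: each cut piece extracted from the DXF file carries its outline and its stitching edges, and edge pairs are resolved from mate references before the pieces are stitched around the body model. The field names and the pairing strategy are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class StitchEdge:
    piece_id: str
    edge_id: str
    mate_piece_id: str   # counterpart recorded in the DXF feature data (assumed encoding)
    mate_edge_id: str

@dataclass
class CutPiece:
    piece_id: str
    outline: list                                  # 2D contour points extracted from the DXF file
    edges: list = field(default_factory=list)      # StitchEdge records for this piece

def pair_stitch_edges(pieces):
    """Match stitching edges across cut pieces using the mate references in the feature data."""
    index = {(e.piece_id, e.edge_id): e for p in pieces for e in p.edges}
    pairs, seen = [], set()
    for key, edge in index.items():
        mate_key = (edge.mate_piece_id, edge.mate_edge_id)
        if mate_key in index and key not in seen and mate_key not in seen:
            pairs.append((edge, index[mate_key]))
            seen.update({key, mate_key})
    return pairs
```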
Owner:INST OF AUTOMATION CHINESE ACAD OF SCI

3D (three dimensional) hologram Internet of Vehicles interactive display terminal

The invention provides a 3D (three-dimensional) hologram Internet of Vehicles interactive display terminal, which comprises a vehicle-mounted diagnostic system, a cloud platform and a 3D interactive hologram in-vehicle navigation system. The vehicle-mounted diagnostic system is installed in a vehicle and is used for collecting vehicle state data and acquiring vehicle diagnostic data. The cloud platform is connected with a first wireless communication module and is used for receiving and storing the vehicle state data and the vehicle diagnostic data. The 3D interactive hologram in-vehicle navigation system is connected with the vehicle-mounted diagnostic system and the cloud platform and is located in the vehicle; it comprises a second wireless communication module, an interaction system, projection equipment, and a plurality of image collection modules and voice collection modules. With the terminal, the vehicle and map data interfaces can be projected into the vehicle interior as 3D images, so that the driver can observe the data interfaces in real time without taking their eyes off the road, and driving safety is improved.
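A rough sketch of the kind of payload the vehicle-mounted diagnostic system might upload to the cloud platform over the first wireless communication module; the field names and JSON transport are assumptions, since the abstract only states that state and diagnostic data are collected, sent and stored.

```python
import json
import time

def build_telemetry_packet(vehicle_id, state_data, diagnostic_data):
    """Illustrative payload uploaded by the on-board diagnostic system (assumed field names)."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": int(time.time()),
        "state": state_data,             # e.g. speed, fuel level, engine temperature
        "diagnostics": diagnostic_data,  # e.g. fault codes read from the OBD interface
    })

packet = build_telemetry_packet("VIN123", {"speed_kph": 62}, {"dtc": ["P0301"]})
print(packet)
```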
Owner:王占奎

VR (virtual reality) equipment based VR interactive simulation practical training system

A VR-equipment-based VR interactive simulation practical training system comprises VR interaction equipment, a PC for constructing simulation practical training modules, and a host for data conversion and transmission. The VR interaction equipment comprises a head-mounted display device and a sensor module. The practical training method of the system comprises the following steps: 1) the PC, combined with the 3D interaction functions of VR technology, constructs virtual practical training modules, and the practitioner selects the virtual practical training module for which simulated training is needed; 2) the practitioner puts on the VR interaction equipment, and the PC starts the corresponding virtual practical training module; and 3) the practitioner obtains his or her position and status in the virtual application scene via the head-mounted display device and completes the corresponding operations via the sensor module according to the corresponding practical training task. Constructing different virtual practical training modules improves the convenience and quality of simulated teaching, and the system is suitable for teaching exercises and simulation practical training.
Owner:浙江引力波网络科技有限公司

Immersed naked-eye 3D interactive method and system based on simulation teaching training room

Status: Inactive | Publication number: CN107783306A | Effects: avoids unrealistic presentation; avoids the inability to perform naked-eye 3D interaction | Classifications: input/output for user-computer interaction; diffusing elements | Keywords: third party; interaction control
The invention discloses an immersive naked-eye 3D interaction method and system based on a simulation teaching training room and relates to the field of virtual simulation teaching training rooms. The method comprises the following steps: 1) splitting a planar 3D scene in a rendering engine to obtain a T-type or cross-type scene; 2) performing fusion and cluster rendering of the T-type or cross-type scene with a rendering host and the rendering engine to obtain a synchronously rendered picture, and connecting it to a display device to form a four-sided or five-sided synchronous-display projection wall; 3) completing real-time interactive control of the picture with a control engine on the basis of step 2. The system comprises the rendering host, the rendering engine, the display device, a router and the control engine. The method and system solve the problem in existing simulation teaching and training that third parties must watch the virtual picture through auxiliary equipment and picture synthesis and cannot interact immersively with it with the naked eye, which leads to a poor training effect, and an immersive naked-eye 3D interaction effect is realized.
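A minimal sketch of how a cross-type split of the rendered scene might be laid out as viewports for a four- or five-sided projection wall; the pixel sizes, wall arrangement and function name are assumptions for illustration, since the abstract only names T-type and cross-type splits.

```python
def cross_layout(wall_px=1920, wall_py=1080, include_floor=True):
    """Viewport placement for a cross-type split of the rendered scene.

    Left, front, right and back walls sit in a row; the floor (the fifth side)
    hangs below the front wall. The exact arrangement is an assumed example.
    """
    walls = ["left", "front", "right", "back"]
    viewports = {name: {"x": i * wall_px, "y": 0, "w": wall_px, "h": wall_py}
                 for i, name in enumerate(walls)}
    if include_floor:
        viewports["floor"] = {"x": 1 * wall_px, "y": wall_py, "w": wall_px, "h": wall_py}
    return viewports

for name, vp in cross_layout().items():
    print(name, vp)
```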
Owner:河南新汉普影视技术有限公司