320 results about "Virtual cinematography" patented technology

Virtual cinematography is the set of cinematographic techniques performed in a computer graphics environment. It covers a wide variety of subjects, such as photographing real objects, often with a stereo or multi-camera setup, for the purpose of recreating them as three-dimensional objects, and algorithms for the automated creation of real and simulated camera angles.

Virtual fly over of complex tubular anatomical structures

An embodiment of the invention is a method, which can be implemented in software, firmware, hardware, etc., for virtual fly-over inspection of complex anatomical tubular structures. In a preferred embodiment, the method is implemented in software, and the software reconstructs the tubular anatomical structure from binary imaging data originally acquired from a computed tomography scan or comparable biological imaging system. The software of the invention splits the entire tubular anatomy into exactly two halves and assigns a virtual camera to each half to perform fly-over navigation. By controlling the elevation of the virtual camera, there is no restriction on its field-of-view (FOV) angle, which can be greater than 90 degrees, for example. The camera viewing volume is perpendicular to each half of the tubular anatomical structure, so potential structures of interest, e.g., polyps hidden behind haustral folds in a colon, are easily found. The orientation of the splitting surface is also controllable, so the navigation can be repeated at one or more additional split orientations. This avoids the possibility that a structure of interest, e.g., a polyp divided between the two halves of the anatomical structure in a first fly-over, is missed during a virtual inspection. Preferred embodiment software conducts a virtual colonoscopy fly-over. Experimental virtual fly-over colonoscopy software of the invention, applied to 15 clinical datasets, demonstrated an average surface visibility coverage of 99.59 ± 0.2%.
Owner:UNIV OF LOUISVILLE RES FOUND INC
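
The core geometric idea in the abstract above (one splitting surface, two halves, one virtual camera per half flying along the tube) can be illustrated with a short sketch. The sketch below is only a minimal illustration under assumed inputs (a sampled centerline, a surface point cloud, and a chosen split-plane normal); it is not the patent's actual implementation.

```python
# Minimal sketch of the fly-over idea: split the tubular surface into two
# halves about a splitting surface, then place one elevated virtual camera
# per half at each centerline sample, looking back at the centerline so its
# viewing volume is roughly perpendicular to that half.
import numpy as np

def split_halves(vertices, centerline, split_normal):
    """Classify surface vertices into two halves by the sign of their offset
    from a splitting plane through the nearest centerline point."""
    d = np.linalg.norm(vertices[:, None, :] - centerline[None, :, :], axis=2)
    nearest = centerline[np.argmin(d, axis=1)]
    side = np.sign(np.einsum("ij,j->i", vertices - nearest, split_normal))
    return vertices[side >= 0], vertices[side < 0]

def fly_over_cameras(centerline, split_normal, elevation=30.0):
    """One camera per half at each centerline sample, raised along
    +/- split_normal; a larger elevation widens the usable FOV."""
    cams = []
    n = np.asarray(split_normal, dtype=float)
    for i in range(len(centerline) - 1):
        p = centerline[i]
        forward = centerline[i + 1] - p
        forward = forward / np.linalg.norm(forward)
        for s in (+1.0, -1.0):  # one camera above each half
            cams.append({"position": p + s * elevation * n,
                         "look_at": p, "forward": forward})
    return cams
```

Repeating the run with a different split_normal corresponds to the abstract's repeated navigation at another split orientation, so a structure cut by the first splitting surface is still seen whole.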

Real-time virtual scene LED shooting system and method

The invention discloses a real-time virtual-scene LED shooting system and method, belonging to the field of film and television shooting. According to the invention, digital assets are called to construct a virtual scene according to the content the shooting scene needs to present, a virtual LED screen and a virtual camera are reconstructed in a virtual engine module, and the real ambient-lighting information in the studio is synchronized to the virtual engine in real time. The virtual engine module performs distributed real-time rendering of the virtual scene and displays it on the virtual LED screen; outside the virtual LED screen, the virtual engine further overlays a picture with a depth channel, rendered in real time according to the position of the physical camera and its lens-distortion information. The physical LED screen displays the virtual LED screen, the physical camera completes the shot, the XR module obtains the depth-channel picture and the picture shot by the physical camera, and a final picture is obtained through compositing. The method can replace green-screen keying to achieve near-final footage directly in camera in most environments, streamline the film production workflow, and save the cost of complex visual effects.
Owner:浙江时光坐标科技股份有限公司
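
The final XR compositing step described above, combining the picture shot off the physical LED wall with an engine-rendered extension that carries a depth channel, might look roughly like the following sketch. The array names, the LED-region mask, and the clip range are illustrative assumptions rather than the system's actual interface.

```python
# Sketch of a depth-aware XR composite: keep the physical camera's pixels
# where the LED wall fills the frame, and extend the set with the engine
# render (plus its depth channel) everywhere else.
import numpy as np

def composite_xr(camera_frame, engine_rgb, engine_depth, led_mask,
                 near=0.1, far=1000.0):
    """camera_frame, engine_rgb: (H, W, 3) float images in [0, 1];
    engine_depth: (H, W) depth channel from the virtual engine;
    led_mask: (H, W) bool, True where the LED wall covers the camera frame."""
    # Inside the LED region, keep the in-camera pixels; outside it, use the
    # engine render so the virtual set extends beyond the physical wall.
    out = np.where(led_mask[..., None], camera_frame, engine_rgb)
    # The depth channel can later drive depth-based effects (fog, defocus);
    # here it is only clamped to the camera's clip range as a placeholder.
    depth = np.clip(engine_depth, near, far)
    return out, depth
```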

Control method and device of virtual camera in virtual studio, implementation method of virtual studio, virtual studio system, computer readable storage medium and electronic equipment

The invention relates to the technical field of computers, and provides a control method and device for a virtual camera in a virtual studio, an implementation method of the virtual studio, a virtual studio system, a computer-readable storage medium, and electronic equipment. The control method of the virtual camera in the virtual studio comprises the steps of: obtaining preset parameters of the virtual camera for different preset studio segments, wherein the preset parameters comprise at least one of position, pose, and focal length; generating a corresponding camera-move control for each preset parameter; and, in response to a triggering operation on any camera-move control, adjusting the virtual camera according to the preset parameter corresponding to that control. Based on the generated camera-move controls, the virtual camera can shoot the virtual background pictures corresponding to different preset studio segments, so that the virtual background pictures and the real pictures shot by the physical camera are better fused, improving the realism of the broadcast pictures.
Owner:NETEASE (HANGZHOU) NETWORK CO LTD
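
A minimal sketch of the described control flow, assuming hypothetical class and field names: each preset studio segment stores a camera preset (position, pose, focal length), one camera-move control is generated per preset, and triggering a control applies its preset to the virtual camera.

```python
# Sketch: presets keyed by studio segment, one control per preset, and a
# trigger that applies the corresponding preset to the virtual camera.
from dataclasses import dataclass

@dataclass
class CameraPreset:
    position: tuple       # (x, y, z) in scene units
    rotation: tuple       # (pitch, yaw, roll) in degrees
    focal_length: float   # millimetres

class VirtualStudioCamera:
    def __init__(self, presets):
        # presets: {segment_name: CameraPreset}; controls mirror the presets.
        self.presets = presets
        self.controls = list(presets.keys())
        self.state = None

    def trigger(self, control):
        """Apply the preset bound to the triggered camera-move control."""
        preset = self.presets[control]
        self.state = preset  # a real engine would interpolate toward it
        return preset

cam = VirtualStudioCamera({
    "opening": CameraPreset((0, 1.6, -5), (0, 0, 0), 35.0),
    "interview": CameraPreset((2, 1.5, -3), (0, -30, 0), 50.0),
})
cam.trigger("interview")
```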

Hand-drawn scene three-dimensional modeling method combining multi-perspective projection with three-dimensional registration

The invention provides a hand-drawn scene three-dimensional modeling method combining multi-perspective projection with three-dimensional registration. The method comprises the following steps. Standardized preprocessing is performed on all three-dimensional models in a three-dimensional model base; virtual cameras are arranged at the vertices of a regular polyhedron, and projection pictures of each three-dimensional model are shot from all angles to represent its visual shape; visual features of all projection pictures of each three-dimensional model are extracted, and a three-dimensional model feature base is established from these features. Users draw, by hand, two-dimensional pictures of each three-dimensional model of the three-dimensional scene to be shown, together with character labels for the hand-drawn pictures; the drawings are captured with cameras, the image regions are processed, and visual features of the hand-drawn pictures are extracted. The processed character-label regions serve as retrieval keywords, similarity calculation is performed between the visual features of the hand-drawn pictures and the three-dimensional model features in the feature base, and the three-dimensional models of the three-dimensional scene are retrieved. The three-dimensional models with the largest similarity are projected to the corresponding positions through a three-dimensional registration algorithm, and the display of the three-dimensional modeling of the hand-drawn scene is thus achieved.
Owner:BEIJING UNIV OF POSTS & TELECOMM
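
The retrieval step described above, matching a hand-drawn picture's visual features against the multi-view features of each model and using the recognized character label as a keyword filter, can be sketched as follows. The feature extractor, database layout, and similarity measure (cosine over per-view features) are assumptions for illustration, not the patent's specific choices.

```python
# Sketch of keyword-filtered, multi-view similarity retrieval: a model's best-
# matching projection view determines its score against the sketch feature.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def retrieve(sketch_feature, feature_db, keyword=None):
    """feature_db: {model_id: {"label": str, "views": [np.ndarray, ...]}}.
    Returns the model whose best projection view is most similar."""
    best_id, best_score = None, -1.0
    for model_id, entry in feature_db.items():
        if keyword and keyword not in entry["label"]:
            continue  # the recognized text label narrows the search
        score = max(cosine(sketch_feature, v) for v in entry["views"])
        if score > best_score:
            best_id, best_score = model_id, score
    return best_id, best_score
```

The retrieved model would then be handed to the three-dimensional registration step to be placed at the position indicated by the hand-drawn picture.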

Unmanned aerial vehicle positioning method based on a cooperative two-dimensional code of a virtual simulation environment

Publication: CN109658461A (Active)
The invention provides an unmanned aerial vehicle positioning method based on a cooperative two-dimensional code in a virtual simulation environment. The method comprises the steps of: placing a checkerboard in a virtual scene, calibrating the camera, and obtaining the parameters of the virtual camera; identifying an AprilTag two-dimensional code in the scene, accurately positioning the unmanned aerial vehicle through the AprilTag two-dimensional code, and verifying, in the virtual scene, the calibration accuracy of the camera and the feasibility of the AprilTag-based positioning and attitude-determination algorithm. By placing a checkerboard in the virtual scene and using a coordinate-system conversion relationship, the invention obtains the virtual camera parameters and calibrates the camera, providing camera intrinsics for verifying drone visual-navigation algorithms in the virtual scene. This solves the problem that the virtual camera's intrinsic parameters cannot otherwise be acquired; the calibrated camera parameters and the AprilTag two-dimensional code positioning algorithm are then used to solve for the position parameters of the camera, which solves the problem of fast and robust positioning of the unmanned aerial vehicle in a complex environment.
Owner:NO 20 RES INST OF CHINA ELECTRONICS TECH GRP
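
A hedged sketch of the two steps described above, written with OpenCV (an assumption; the patent does not name a library): calibrating the virtual camera from checkerboard renders, then solving the camera pose from the four corners of an AprilTag of known size. Tag corner detection is assumed to be handled by an external detector, and all function and parameter names are illustrative.

```python
# Sketch: (1) estimate virtual-camera intrinsics from checkerboard images,
# (2) recover the camera pose from known AprilTag corner correspondences.
import cv2
import numpy as np

def calibrate_virtual_camera(checkerboard_images, pattern=(9, 6), square=0.025):
    # 3D checkerboard corner coordinates in the board's own frame (z = 0).
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
    obj_pts, img_pts = [], []
    for img in checkerboard_images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                           gray.shape[::-1], None, None)
    return K, dist  # intrinsic matrix and distortion coefficients

def pose_from_tag(tag_corners_px, tag_size, K, dist):
    # 3D corners of a square AprilTag of side tag_size, centred at the origin.
    s = tag_size / 2.0
    obj = np.array([[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]], np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, np.asarray(tag_corners_px, np.float32),
                                  K, dist)
    # rvec/tvec give the tag's pose in the camera frame; inverting this
    # transform yields the camera's (and hence the UAV's) pose relative
    # to the tag placed in the virtual scene.
    return rvec, tvec
```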