63 results for "Simultaneous localisation and mapping" patented technology

Autonomous gardening vehicle with camera

CN104714547A (Active): Expected to operate autonomously; Mowers; Watering devices; Terrain; Drive wheel
Method for generating scaled terrain information with an unmanned autonomous gardening vehicle (1), the gardening vehicle (1) comprising a driving unit with a set of at least one drive wheel (5) and a motor connected to the at least one drive wheel for providing movability of the gardening vehicle (1), a gardening tool (7) and a camera (10a-b) for capturing images of a terrain, the camera (10a-b) being positioned and aligned in a known manner relative to the gardening vehicle (1). In the context of the method, the gardening vehicle (1) is moved in the terrain while concurrently generating a set of image data by capturing an image series of terrain sections, so that at least two (successive) images of the image series cover a number of identical points in the terrain, the terrain sections being defined by the viewing area of the camera (10a-b) at the respective camera positions during the movement. Furthermore, a simultaneous localisation and mapping (SLAM) algorithm is applied to the set of image data and terrain data is thereby derived, the terrain data comprising a point cloud representing the captured terrain and position data relating to a relative position of the gardening vehicle (1) in the terrain. Additionally, the point cloud is scaled by applying absolute scale information to the terrain data, particularly wherein the position data is scaled.
Owner: HEXAGON TECH CENT GMBH
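The pipeline described in this abstract ends with a scaling step: a camera-only SLAM reconstruction is defined only up to an unknown scale, so an absolute measurement is used to bring the point cloud and the position data into metric units. The sketch below illustrates that final step only; the function name, the use of a wheel-odometry distance as the source of absolute scale, and the array layout are illustrative assumptions, not details taken from the patent.

```python
# Minimal sketch of the scaling step: rescale a SLAM point cloud and the
# estimated vehicle trajectory so that the trajectory length matches an
# absolute distance measurement (hypothetically from wheel odometry).
import numpy as np


def scale_terrain_data(point_cloud, trajectory, measured_distance_m):
    """Scale a point cloud and trajectory from SLAM units to metres.

    point_cloud         -- (N, 3) array of terrain points in SLAM units
    trajectory          -- (M, 3) array of camera positions in SLAM units
    measured_distance_m -- absolute distance (metres) covered by the vehicle
                           (assumed available from an external sensor)
    """
    # Path length of the estimated trajectory in (unscaled) SLAM units.
    slam_distance = np.sum(np.linalg.norm(np.diff(trajectory, axis=0), axis=1))
    scale = measured_distance_m / slam_distance
    # The same factor is applied to the point cloud and the position data.
    return point_cloud * scale, trajectory * scale, scale


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    traj = np.cumsum(rng.normal(0, 0.1, size=(50, 3)), axis=0)  # fake SLAM trajectory
    cloud = rng.normal(0, 1.0, size=(500, 3))                   # fake terrain points
    cloud_m, traj_m, s = scale_terrain_data(cloud, traj, measured_distance_m=12.5)
    print(f"scale factor applied: {s:.3f}")
```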

Radar-based system and method for real-time simultaneous localization and mapping

A method for performing Simultaneous Localization And Mapping (SLAM) of the surroundings of an autonomously controlled moving platform (such as a UAV or a vehicle), using radar signals, comprising the following steps:
  • receiving samples of the received IF radar signals from the DSP;
  • receiving the previous map from memory;
  • receiving data regarding motion parameters of the moving platform from an Inertial Navigation System (INS) module containing MEMS sensor data;
  • grouping points into bodies using a clustering process;
  • merging bodies that are marked by the clustering process as separate bodies, using prior knowledge;
  • for each body, creating a local grid map around the body with a mass function per entry of the grid map;
  • matching bodies from the previous map with the new bodies;
  • calculating the assumed new location of the moving platform for each of the mL particles, using previous frame results and new INS data;
  • for each calculated new location of the mL particles, sampling N assumed locations with a normal distribution;
  • for each body and each body particle from the previous map, and for each location particle, calculating the velocity vector and orientation of the body between the previous and current map with respect to the environment, using the body velocity calculated in the previous step, and sampling N2 particles of the body, or using image registration to create one particle, where each particle contains the location and orientation of the body in the new map;
  • for each particle from the collection of particles of the body: propagating the body grid according to the chosen particle; calculating the Conflict between the newly observed body and the propagated body grid, using a fusion function on the parts of the grids that are matched; calculating the Extra Conflict as the sum of the mass of occupied new and old grid cells that do not have a match between the grids; and calculating the particle weight as an inverse weight of the combination of Conflict and Extra Conflict;
  • for each of the N1 location particles, calculating the weight as the sum of the best weight per body for that particle;
  • resampling mL particles for locations according to the location weights;
  • for each body, choosing mB particles from the particles whose location is one of the mL chosen locations, according to the particle weights;
  • for each body and each of the mB particles, calculating the body velocity using a motion model; and
  • creating the map for the next step, with all the chosen particles and the mass function of the grid around each body, for each body particle according to the fusion function.
Owner: ARBE ROBOTICS LTD
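The weighting and resampling loop in the steps above can be illustrated with a much-simplified sketch: each candidate particle is scored by how little its propagated occupancy grid conflicts with the newly observed grid, and particles are then resampled in proportion to those inverse-conflict weights. The code below is an illustrative assumption of how such a step could look, not the patented method; the conflict formula, function names, and grid representation are all hypothetical.

```python
# Simplified sketch of conflict-based particle weighting and resampling
# for grid maps (illustrative only; not the patented fusion functions).
import numpy as np


def grid_conflict(observed, propagated):
    """Conflict between two occupancy-mass grids of equal shape (values in [0, 1])."""
    # Cells where one grid says "occupied" and the other says "free" contribute conflict.
    return float(np.sum(observed * (1.0 - propagated) + (1.0 - observed) * propagated))


def particle_weights(observed_grid, propagated_grids):
    """Inverse-conflict weights, normalised to sum to one."""
    conflicts = np.array([grid_conflict(observed_grid, g) for g in propagated_grids])
    weights = 1.0 / (conflicts + 1e-9)
    return weights / weights.sum()


def resample(particles, weights, rng):
    """Multinomial resampling: draw particles in proportion to their weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return [particles[i] for i in idx]


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    observed = rng.uniform(0, 1, size=(20, 20))                        # newly observed grid
    particles = [rng.uniform(0, 1, size=(20, 20)) for _ in range(10)]  # propagated grids
    w = particle_weights(observed, particles)
    survivors = resample(particles, w, rng)
    print("particles kept:", len(survivors))
    print("best particle conflict:", min(grid_conflict(observed, p) for p in particles))
```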