
444 results about "Reduce query time" patented technology

Generation and search method for reachability chain list of directed graph in parallel environment

The invention belongs to the field of large-graph data processing and relates to a method for generating and searching a reachability chain list of a directed graph in a parallel environment. The method includes: distributing the directed graph across processors, each of which stores its nodes and their child nodes; compressing the graph data assigned to each processor; computing a backbone-node reachability code for the backbone graph; building a chain index; building a skip list on top of the chain index; exchanging skip-list information among the processors so that each processor updates its own skip list; and building a reachability index for the whole graph. Using graph-reachability compression in the parallel environment greatly reduces the size of the graph data and the system's computing load, so the system can process graph data on a larger scale. The method reads data from disk faster, which indirectly increases search speed, guarantees the accuracy of search results, and greatly reduces the network-communication cost and search time of a parallel computing system during searching.
Owner:NORTHEASTERN UNIV
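The chain-index idea can be sketched in Python. This is a minimal illustration under my own assumptions, not the patented method: the `build_chain_index` and `reachable` helpers are hypothetical names, the nodes are assumed to arrive already topologically sorted and pre-decomposed into chains, and the skip-list layer and inter-processor exchange are omitted. Each node records, per chain, the smallest chain position it can reach, so a reachability query becomes a single index lookup.

```python
from collections import defaultdict

def build_chain_index(nodes, edges, chains):
    """nodes: topologically sorted; chains: node lists, each a path in the DAG.
    For every node, record the smallest reachable position on each chain."""
    chain_of, pos_of = {}, {}
    for ci, chain in enumerate(chains):
        for p, n in enumerate(chain):
            chain_of[n], pos_of[n] = ci, p
    succ = defaultdict(list)
    for u, v in edges:
        succ[u].append(v)
    index = {n: {} for n in nodes}
    # reverse topological order: successors are indexed before their parents
    for n in reversed(nodes):
        idx = index[n]
        idx[chain_of[n]] = pos_of[n]      # a node reaches itself
        for v in succ[n]:
            for ci, p in index[v].items():
                if p < idx.get(ci, float("inf")):
                    idx[ci] = p
    return index, chain_of, pos_of

def reachable(index, chain_of, pos_of, u, v):
    """u reaches v iff u reaches some position on v's chain at or before v."""
    p = index[u].get(chain_of[v])
    return p is not None and p <= pos_of[v]
```

A skip list over each chain, as the abstract describes, would then let long chains be scanned in logarithmic rather than linear time.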

System and method for e-commerce matchmaking and transactions

The invention discloses a system and method for e-commerce matchmaking and transactions. The system comprises a user management module, a target management module, a transaction management module and an intermediary module. Through the intermediary module, an intermediary user provides commodities, services or user-recommendation services to buyers and sellers; the module receives the commodities, services or user parameter information the buyer wants to purchase and those the seller offers, and stores them in a database; it matches the two sides' parameter information, calculates a matching coincidence rate, and, according to that rate, presents one or more groups of candidate commodities, services or users for the buyer and seller to choose from. An intermediary user with deep knowledge of a product category can effectively reduce the information-query time and cost for both trading parties, and the presence of such intermediaries greatly improves the success rate of online transactions.
Owner:陈晓亮
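The "matching coincidence rate" is not defined in the abstract; a plausible minimal reading is the (optionally weighted) fraction of the buyer's required attributes that a seller's offer satisfies. The sketch below uses that assumption, with hypothetical helper names `match_rate` and `rank_offers`.

```python
def match_rate(buyer, seller, weights=None):
    """Fraction of the buyer's requirements the seller satisfies,
    optionally weighted per attribute. Assumed definition, not the patent's."""
    if not buyer:
        return 0.0
    weights = weights or {}
    total = sum(weights.get(k, 1.0) for k in buyer)
    hit = sum(weights.get(k, 1.0) for k in buyer if seller.get(k) == buyer[k])
    return hit / total

def rank_offers(buyer, offers, top_n=3):
    """Return the best-matching groups of offers for the two parties to choose from."""
    return sorted(offers, key=lambda o: match_rate(buyer, o), reverse=True)[:top_n]
```

Ranking by this rate and returning the top few groups mirrors the abstract's "select one or more groups according to the matching coincidence rate".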

Secondary surveillance radar track extraction method for multimode polling and S-mode roll-call interrogation

Inactive · CN103076605A · Radio wave reradiation/reflection · Secondary surveillance radar · Time-space
The invention provides a secondary surveillance radar track extraction method for multimode polling and S-mode roll-call interrogation. The method effectively removes false targets in an interference environment and improves the accuracy and real-time performance of S-mode queries. The technical scheme is as follows: plot combination is performed on S-mode plot data gathered from the same sector of a secondary surveillance radar interrogator, the mean and variance of the plot data are calculated, and a real track is initiated after time-space registration; the combined plot data are divided according to the radar's sector characteristics, and an associated sector window is created to enter the plot-and-track processing flow; debiased measurement conversion is first applied to the associated plot data, a CV (constant velocity) model, a CA (constant acceleration) model and a current statistical model are combined into a target multi-model, and the interactive input state-estimation vector and variance of each filter are calculated under their interaction; interference targets are removed using dedicated removal strategies; and unnecessary target files are removed through track management, tracking is terminated, and a correct secondary surveillance radar target-track extraction result is produced.
Owner:10TH RES INST OF CETC
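The multi-model filtering step combines motion models and blends their outputs by model probability. The toy sketch below shows only that idea in one dimension, with hypothetical names (`cv_predict`, `ca_predict`, `imm_mix`); the patent's debiased measurement conversion, covariances, and filter bank are all omitted.

```python
def cv_predict(pos, vel, dt):
    # constant-velocity model: position advances linearly, velocity unchanged
    return pos + vel * dt, vel

def ca_predict(pos, vel, acc, dt):
    # constant-acceleration model: standard kinematic update
    return pos + vel * dt + 0.5 * acc * dt * dt, vel + acc * dt

def imm_mix(estimates, probs):
    # interacting step: blend per-model estimates by model probabilities
    return sum(e * p for e, p in zip(estimates, probs))
```

In a full interacting multiple model (IMM) filter, the probabilities themselves are updated each scan from how well each model predicted the latest plot.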

Electronic police background intelligent management and automatic implementation system

An electronic-police background intelligent management and automatic implementation system includes a data-processing workflow module and a data-discovery module. The workflow module transmits data files from front-end acquisition devices to the background system, which performs automatic license-plate recognition, queries vehicle information and generates traffic tickets; the data-discovery module queries, cleans and runs statistics on the data, raises alarms for repeat offenders and helps schedule police duties reasonably. By coordinating various advanced devices and technologies, the system achieves automation and intelligence and builds a real-time, accurate and effective integrated traffic-management system on a large scale, thereby ensuring traffic safety, reducing officers' workload, freeing a large number of officers to handle emergencies, protecting lives and property, and bringing traffic, social, economic and environmental benefits.
Owner:BEIHANG UNIV

Artificial intelligence-based searching error correction method and apparatus

The invention provides an artificial intelligence-based search error correction method and apparatus. The method comprises: receiving a first query statement input by a user and determining, according to a preset error correction strategy, whether it satisfies the error correction condition; if so, identifying the word segments of the first query statement that need correction; obtaining candidate results for those segments according to a preset candidate-recall strategy; selecting the correction for each segment according to the quality feature values of its candidates; and applying the corrections to the first query statement to generate a second query statement. Because historical data is used to decide accurately whether a query needs correction, and the candidates are screened precisely to determine the final correction, the error correction efficiency and accuracy of the search engine are improved, the user's query time is reduced, and user experience is improved.
Owner:BEIJING BAIDU NETCOM SCI & TECH CO LTD
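The decide-then-screen flow can be sketched in Python. Everything here is a hypothetical stand-in: the trigger in `needs_correction` (fire when too few tokens are known terms) substitutes for the patent's unspecified error-correction strategy, and the `quality` table substitutes for its quality feature values.

```python
def needs_correction(query, known_terms, min_hits=1):
    """Toy trigger: correct when fewer than half the tokens (and fewer
    than min_hits) are recognized terms."""
    tokens = query.split()
    hits = sum(t in known_terms for t in tokens)
    return hits < max(min_hits, len(tokens) // 2)

def best_candidate(token, candidates, quality):
    """Pick the recalled candidate with the highest quality feature value;
    fall back to the original token if nothing was recalled."""
    scored = [(quality.get(c, 0.0), c) for c in candidates]
    return max(scored)[1] if scored else token
```

A real system would derive `known_terms` and `quality` from query logs, which is presumably the "historical data" the abstract refers to.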

Information associating method based on user operation record and resource content

The invention relates to an information-association method based on a user's operation record and resource content. The method first automatically mines a task model from the user's operation history on a personal computer and a topic model from the content of the resources involved in those operations; it then combines the association between the two models' measurements; and finally, when the user opens a resource, it finds the other resources most relevant to the current one and recommends them, requiring no extra action from the user throughout. By automatically mining the operation-history-based task model and the content-based topic model, related resources are recommended without any extra operation; the aim is to save the time the user spends searching for files, preserve the continuity of the user's tasks as much as possible, and effectively reduce the burden of switching between tasks.
Owner:PEKING UNIV
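The recommendation step needs some similarity measure between the current resource and the others; the abstract does not name one, so the sketch below assumes cosine similarity over sparse term-weight vectors. The helper names `cosine` and `recommend` are mine.

```python
import math

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(current, resources, top_n=2):
    """Rank the other resources by topic similarity to the one in use."""
    ranked = sorted(resources.items(),
                    key=lambda kv: cosine(current, kv[1]), reverse=True)
    return [name for name, _ in ranked[:top_n]]
```

In the patent's setting, the vectors would come from the mined topic model rather than raw term counts, and the task model would further constrain which resources are candidates.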

Cache management method for a distributed in-memory column database

Active · CN106294772A · Reduce calculations for repetitive tasks · Reduce query time · Special data processing applications · Execution plan · Distributed memory
The invention discloses a cache management method for a distributed in-memory column database. The method comprises: establishing cache queues on the cache master-control nodes; taking each physical task as a root node and cutting the physical execution plan it belongs to, so as to obtain the cache calculation track corresponding to each physical task; building cache feature trees on the cache master-control nodes from those tracks; when a query request arrives, having the execution engine parse the SQL statement into a physical execution plan; performing a level-order traversal of the plan from its root node and judging whether the cache calculation track of each physical task matches the corresponding cache feature tree; if it does, reading the task's cached data directly from the cache nodes, and otherwise computing the task. Through an efficient cache-matching algorithm, the method rapidly detects whether a cache hit occurs and improves query efficiency.
Owner:UNIV OF ELECTRONIC SCI & TECH OF CHINA
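One common way to match a plan subtree against cached results is to give every operator subtree a canonical fingerprint and key the cache on it. The sketch below assumes that approach; the tuple-based plan shape `(op, params, children)` and the `PlanCache` class are illustrative, not the patent's cache-feature-tree structure.

```python
import hashlib

def plan_fingerprint(node):
    """Canonical hash of an operator subtree: (op, sorted params, child hashes).
    Equal subtrees hash equally regardless of param dict ordering."""
    op, params, children = node
    child_keys = [plan_fingerprint(c) for c in children]
    payload = repr((op, sorted(params.items()), child_keys))
    return hashlib.sha1(payload.encode()).hexdigest()

class PlanCache:
    """Map subtree fingerprints to previously computed task results."""
    def __init__(self):
        self.store = {}
    def lookup(self, node):
        return self.store.get(plan_fingerprint(node))   # None on cache miss
    def put(self, node, result):
        self.store[plan_fingerprint(node)] = result
```

A level-order traversal of an incoming plan then probes the cache at each node, reusing the cached result for any subtree that hits, as the abstract describes.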

Geographical-tag-oriented hot spot area event detection system applied to LBSN

The invention discloses a geographical-tag-oriented hot-spot-area event detection system applied to an LBSN (location-based social network), belonging to the technical field of network data processing. The system runs in the LBSN and is composed of a sign-in clustering module, a tag-clustering-based area calculation module and a hot-spot-area event calculation module. The sign-in clustering module clusters sign-in information to obtain the geographical areas the sign-ins correspond to; the area calculation module applies a geographical-tag clustering algorithm to derive a set of in-cluster areas from those geographical areas; and the event calculation module uses the sign-in frequency within a time window to extract hot-spot-area events from the in-cluster area set and present them to the user. By further clustering the points inside each cluster over a smaller range, the system greatly reduces the volume of data to be computed in the LBSN and improves calculation efficiency.
Owner:北京中实信息技术有限公司
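The final step, flagging areas whose sign-in frequency inside a time window exceeds a threshold, is simple to sketch. The function name `hot_areas` and the `(timestamp, area_id)` check-in shape are my assumptions; the upstream tag clustering that produces the area IDs is omitted.

```python
from collections import Counter

def hot_areas(checkins, window, threshold):
    """checkins: iterable of (timestamp, area_id) pairs.
    Count sign-ins per area inside [start, end) and flag areas whose
    frequency reaches the threshold as hot-spot events."""
    start, end = window
    counts = Counter(area for t, area in checkins if start <= t < end)
    return {area for area, c in counts.items() if c >= threshold}
```

Sliding the window forward over the sign-in stream would yield the time-resolved sequence of hot-spot events the system provides to the user.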

Method for automatically fragmenting and classifying a translation manuscript based on a large-scale term corpus

The invention provides a method for automatically fragmenting and classifying a translation manuscript based on a large-scale term corpus. The manuscript is segmented into words, stop words are removed to obtain a keyword set, the keywords of each paragraph are extracted, and a mapping from each paragraph to the keywords it contains is built. Each keyword of the manuscript is then matched against the term corpus one by one, and the industry category of the matched term is taken as a candidate category of the paragraph containing that keyword; based on the mapping, the category that occurs most often within a paragraph is determined, and the paragraph is classified under it. Because the manuscript contains far fewer words than the term corpus, and the corpus supports alphabetical lookup, no pattern-matching algorithm is needed when matching keywords against the corpus; lookup time is therefore greatly reduced, manuscript fragmentation is faster, and fragmentation efficiency is improved.
Owner:IOL WUHAN INFORMATION TECH CO LTD
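The per-paragraph majority vote can be sketched directly. This is a toy under my own assumptions: `classify_paragraph` is a hypothetical name, the term corpus is modeled as a dict from term to industry category (a dict lookup playing the role of the corpus's alphabetical lookup), and tokenization is assumed done upstream.

```python
from collections import Counter

def classify_paragraph(tokens, term_categories, stop_words=frozenset()):
    """Each token found in the term corpus votes for its industry category;
    the paragraph takes the most frequent category (None if nothing matched)."""
    votes = Counter(term_categories[t] for t in tokens
                    if t not in stop_words and t in term_categories)
    return votes.most_common(1)[0][0] if votes else None
```

Running this over every paragraph and grouping paragraphs by resulting category gives the manuscript's fragmentation.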

Method and device for querying data in a database

The application discloses a method and a device for querying data in a database. The method comprises: receiving a query instruction; determining the state of each block index in a block index collection pre-stored in the video memory of a GPU (graphics processing unit); when every block index in the GPU's video memory is in the asynchronous state, starting one or more GPU processes and filtering the block indexes in the GPU's video memory according to the query conditions to obtain a first block-index query result; and determining the final data query result from that first result. A CPU (central processing unit) pre-generates the block index collection for the database; because the block index collection is smaller than the original or partitioned data, all of it can be copied into the GPU's global memory. When all block indexes in the GPU's global memory are in the asynchronous state, the query is executed directly by GPU processes, avoiding the repeated copying of partitioned data from main memory required in the prior art; query time is thus shortened and query efficiency improved.
Owner:HUAWEI TECH CO LTD
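The reason a block index fits in GPU memory is that it stores only a small summary per block. A common form of such a summary is a per-block (min, max) pair, which lets whole blocks be pruned before any row is touched. The pure-Python sketch below illustrates that pruning idea only; the GPU execution, the state tracking, and the names `build_block_index`/`query` are my assumptions, not Huawei's API.

```python
def build_block_index(data, block_size):
    """One (start_offset, min, max) summary per block -- far smaller
    than the data itself, so the whole index can live in fast memory."""
    index = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        index.append((i, min(block), max(block)))
    return index

def query(data, index, block_size, lo, hi):
    """Range query [lo, hi]: scan only blocks whose summary can contain a match."""
    out = []
    for start, bmin, bmax in index:
        if bmax < lo or bmin > hi:
            continue                     # block pruned by the index alone
        out.extend(v for v in data[start:start + block_size] if lo <= v <= hi)
    return out
```

On a GPU, each block summary would be tested by a separate thread, which is what makes filtering the index a good fit for the "one or more GPU processes" in the abstract.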