
Caching of potential search results

A caching and cache-group technology, applied in the fields of network data retrieval, database retrieval, and special-purpose data processing, which addresses problems such as high search latency.

Pending Publication Date: 2021-11-19
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

Pre-caching results ensures low latency, but can sometimes cause results to switch positions in the displayed list before the user makes a selection.



Embodiment Construction

[0015] There are various scenarios in which a user may wish to search for entity records from different data sources. It is desirable both to display results to the user quickly (performance) and to ensure that the correct entities are available for selection (integrity). Common solutions each have drawbacks: caching is memory-intensive, while streaming is CPU- and bandwidth-intensive. The conventional way to execute queries is to evict the entire cache on each query so that the results are optimized for the current query string. The approach disclosed below is a compromise between these: results are streamed into segmented caches for different segments and/or sources, with each segment updated at its own frequency. This can provide significant latency improvements across a variety of application scenarios, whether the user is searching for people they frequently work with or for someone they are interacting with for the first time.
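The segmented approach described in [0015] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, segment names, and refresh intervals are all hypothetical. The key idea is that each segment (e.g. frequent contacts vs. the wider directory) keeps its own cached entries and refresh cadence, instead of the whole cache being evicted on every query.

```python
import time

class SegmentedCache:
    """Illustrative sketch: per-segment caches refreshed at different
    intervals, rather than evicting the entire cache on each query."""

    def __init__(self, refresh_seconds):
        # refresh_seconds: {segment_name: seconds between refreshes}
        self.refresh_seconds = refresh_seconds
        self.entries = {name: [] for name in refresh_seconds}
        self.last_refresh = {name: 0.0 for name in refresh_seconds}

    def maybe_refresh(self, segment, fetch, now=None):
        """Re-fetch a segment only when its refresh interval has elapsed."""
        now = time.monotonic() if now is None else now
        if now - self.last_refresh[segment] >= self.refresh_seconds[segment]:
            self.entries[segment] = fetch()
            self.last_refresh[segment] = now

    def search(self, query):
        """Match the query against all cached segments, with no network call."""
        q = query.lower()
        return [e for seg in self.entries.values()
                for e in seg if q in e.lower()]
```

A frequently used segment might refresh every few minutes while a directory-wide segment refreshes hourly; a lookup then touches only local memory, which is where the latency improvement comes from.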

[0016] User jobs can be expressed as follows. A user wants...


Abstract

A method comprising: receiving portions of a search query searching amongst multiple entries in a set of one or more data sources; triggered by the receipt of each respective portion, performing a respective caching operation comprising querying each of one or more data sources of the set to retrieve entries matching the search query as composed from the portion or portions received so far, and caching the retrieved entries in a corresponding cache group for each of the queried data sources; rendering an application view displaying a visual representation of at least some of the retrieved entries; and with each caching operation, selecting one or more of any newly retrieved entries in the cache based on the search query as now including the respective portion of the search query, and updating the application view to include the one or more selected entries from the cache.

Description

Technical Field

[0001] The present disclosure relates to performing a search for potential results from one or more sources based on a user-entered search query, and to the caching of potential results.

Background

[0002] Many computer systems and services include search facilities that enable a user to enter a search query to find data records for a targeted entity, such as contact or profile information for a specific other person or group of people, or product information for a desired product. The data source being searched may be external to the user terminal from which the user performs the search. It is therefore known to cache some potential results from the data source locally at the user terminal before the search query is entered. If the search query happens to match one of the cached records, results can be displayed to the user with lower latency than if the external data source were queried only after the search query had been entered. [0003] ...
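The background caching scheme of [0002] amounts to checking a locally pre-populated cache before falling back to the external source. A minimal sketch, with hypothetical function names:

```python
def lookup(query, local_cache, remote_query):
    """Check entries cached locally before the query was entered;
    fall back to the slower external source only on a cache miss.
    local_cache: list of entries; remote_query: callable(query) -> list."""
    q = query.lower()
    hits = [e for e in local_cache if q in e.lower()]
    if hits:
        return hits, "cache"   # low-latency local path
    return remote_query(query), "remote"   # higher-latency fallback
```

This illustrates both the benefit (a hit avoids the round trip entirely) and the limitation the disclosure addresses: integrity depends on the right entries having been cached in advance.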


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/957
CPC: G06F16/9574; G06F16/24552; G06F16/248
Inventors: F·F·尼古拉森, J·维尔纳
Owner: MICROSOFT TECH LICENSING LLC