56,124 results about "Neural network (NN)" patented technology

Color printer characterization using optimization theory and neural networks

A color management method/apparatus generates image color matching and International Color Consortium (ICC) color printer profiles using a reduced number of color patch measurements. Color printer characterization and the generation of ICC profiles usually require a large number of measured data points or color patches and complex interpolation techniques. This invention provides an optimization method/apparatus for performing LAB to CMYK color space conversion, gamut mapping, and gray component replacement. A gamut-trained network architecture performs LAB to CMYK color space conversion to generate a color profile lookup table for a color printer or, alternatively, to directly control the color printer in accordance with a plurality of color patches that accurately represent the gamut of the color printer. More specifically, a feed-forward neural network is trained using an ANSI/IT-8 basic data set consisting of 182 data points or color patches, or using a smaller number of data points, such as 150 or 101, when redundant data points within linear regions of the 182-point set are removed. A 5-to-7 neuron neural network architecture is preferred for performing the LAB to CMYK color space conversion as the profile lookup table is built or as the printer is directly controlled. For each CMYK signal, an ink optimization criterion is applied to control ink parameters such as the total quantity of ink in each CMYK-printed pixel and/or the total quantity of black ink in each CMYK-printed pixel.
Owner: UNIV OF COLORADO, THE REGENTS OF
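
The abstract describes training a small feed-forward network on roughly 182 measured color patches and then using it to fill a LAB-to-CMYK profile lookup table. The sketch below illustrates that idea only; the random patch data, the choice of a 6-neuron hidden layer, and the use of scikit-learn's MLPRegressor as a stand-in for the patent's network are assumptions, not the patented implementation.

```python
# Minimal sketch of a LAB -> CMYK mapping learned from a small patch set.
# The patch values here are synthetic placeholders; real characterization
# would use measured L*a*b* / CMYK pairs (e.g. an ANSI IT8 basic data set).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
lab_patches = rng.uniform([0, -100, -100], [100, 100, 100], size=(182, 3))   # L*, a*, b*
cmyk_patches = rng.uniform(0.0, 1.0, size=(182, 4))                          # C, M, Y, K

# One hidden layer of 5-7 neurons, per the abstract; 6 is used here as an example.
net = MLPRegressor(hidden_layer_sizes=(6,), activation="logistic",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(lab_patches, cmyk_patches)

# Build a coarse LAB -> CMYK profile lookup table by sweeping the LAB space.
grid_l = np.linspace(0, 100, 9)
grid_a = np.linspace(-100, 100, 9)
grid_b = np.linspace(-100, 100, 9)
lut_inputs = np.array([[l, a, b] for l in grid_l for a in grid_a for b in grid_b])
lut_cmyk = np.clip(net.predict(lut_inputs), 0.0, 1.0)   # (729, 4) table entries
```

In practice the ink optimization step mentioned in the abstract (limiting total ink and black ink per pixel) would be applied to each predicted CMYK value before it is written into the table.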

Data storage system with trained predictive cache management engine

In a data storage system, a cache is managed by a predictive cache management engine that evaluates cache contents and purges entries unlikely to receive sufficient future cache hits. The engine includes a single-output back-propagation neural network that is trained in response to various event triggers. Accesses to stored datasets are logged in a data access log, and log entries are removed according to predefined expiration criteria. In response to access of a cached dataset or expiration of its log entry, the cache management engine prepares training data. This is achieved by determining characteristics of the dataset at various past times between the time of the access/expiration and the time of last access, and providing these characteristics and the times of access as input to train the neural network. As another part of training, the cache management engine provides the neural network with output representing the expiration or access of the dataset. According to a predefined schedule, the cache management engine operates the trained neural network to generate scores for cached datasets, ranking the datasets relative to each other. According to this or a different schedule, the cache management engine reviews the scores, identifies one or more datasets with the lowest scores, and purges the identified datasets from the cache.
Owner: IBM CORP
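
The abstract describes training a single-output network on hit/expire outcomes and then scoring cached datasets on a schedule to decide what to purge. The sketch below shows that scoring-and-purge step under stated assumptions: the feature set (recency, hit count, size, age), the synthetic labels, and the scikit-learn MLPRegressor stand-in are illustrative, not the patented engine.

```python
# Minimal sketch of a trained predictive cache scorer and its purge pass.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data derived from the access log: each row summarizes a
# dataset at some past time; the target is 1.0 if it was accessed again (a hit)
# or 0.0 if its log entry expired unused.
rng = np.random.default_rng(1)
features = rng.random((500, 4))                    # [time since last access, hit count, size, age]
outcomes = (features[:, 1] > 0.5).astype(float)    # synthetic hit/expire labels

# Single-output back-propagation network, as in the abstract.
scorer = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
scorer.fit(features, outcomes)

# Scheduled scoring pass: rank currently cached datasets and purge the lowest.
cached = rng.random((20, 4))                       # feature vectors for cached datasets
scores = scorer.predict(cached)
purge_count = 3
purge_idx = np.argsort(scores)[:purge_count]       # datasets least likely to be hit again
```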