
Method for image feature deep learning and perceptibility quantification through computer

A deep-learning and image-feature technology, applied in computer parts, computing, instruments, etc., that can solve problems such as poor overall performance, limited applicability, and long training time, and achieves a simple and convenient operation, stable and reliable performance, and shortened training time.

Inactive Publication Date: 2017-11-24
上海城诗信息科技有限公司 (Shanghai Chengshi Information Technology Co., Ltd.)

AI Technical Summary

Problems solved by technology

[0004] The existing technology therefore has the following defects: (1) the model must be trained on a huge data set for each specific task, resulting in long pre-training time and poor overall performance when performing the task; (2) the computational complexity of CNNs is high, which limits their use in application scenarios with strict real-time requirements; (3) deep, hierarchical image features cannot be extracted, so the perception conveyed by a picture cannot be recognized and large-scale quantitative perception analysis cannot be performed. The applicability and practicality are therefore limited, and it is difficult to meet the needs of the market.

Method used



Examples


Embodiment 1

[0021] A specific embodiment of the invention is shown in figure 1, which is a structural flow diagram of the present invention.

[0022] As shown in figure 1, a method for computer deep learning of image features and quantification of perceptibility comprises the following steps:

[0023] Step A: collect street-view images;

[0024] Step B: establish a scoring model, crowdsource perceptibility scores for a portion of the collected pictures, sort them by perceptibility value, and collect the scoring data;

[0025] Step C: use the scoring data collected in step B to train a convolutional neural network classifier, then use the trained classifier to predict perceptibility scores for the remaining pictures;

[0026] Step D: predict the visual perceptibility of streets on a map and give score estimates for the different perceptibility values of each street.

[0027] In step A, first divide the city a...
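The four steps above can be sketched as a small pipeline. This is a minimal illustration under loud assumptions, not the patented implementation: images are stand-in feature vectors rather than street-view photos, the convolutional neural network classifier of step C is replaced by a plain logistic-regression classifier, and every name (`collect_street_images`, `crowdsource_scores`, `train_classifier`) is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step A (sketch): each "image" is a feature vector; in the patent these
# would be street-view photos collected per city region.
def collect_street_images(n=200, dim=16):
    return rng.normal(size=(n, dim))

# Step B (sketch): crowdsourced perception scores for a labeled subset,
# sorted by perception value; labels are binarized into high/low perception.
def crowdsource_scores(images):
    w_true = rng.normal(size=images.shape[1])
    scores = images @ w_true + 0.1 * rng.normal(size=len(images))
    order = np.argsort(-scores)                              # sort by perception
    return images[order], (scores[order] > 0).astype(float)

# Step C (sketch): train a classifier on the scored subset. A logistic
# regression stands in for the patent's convolutional neural network.
def train_classifier(X, y, lr=0.1, epochs=300):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def predict(w, X):
    return 1.0 / (1.0 + np.exp(-(X @ w)))   # perception score in [0, 1]

# Step D (sketch): score the remaining, unlabeled images.
images = collect_street_images()
labeled, unlabeled = images[:100], images[100:]
X, y = crowdsource_scores(labeled)
w = train_classifier(X, y)
street_scores = predict(w, unlabeled)
print(street_scores.shape)   # one perception estimate per remaining image
```

In the actual method, step C would train a CNN on image pixels; the logistic regression here only mirrors the data flow of steps B–D: labeled subset in, perception scores for the unlabeled remainder out.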



Abstract

The invention discloses a method for image feature deep learning and perceptibility quantification through a computer, comprising the following steps: A, collecting street images; B, building a scoring model, crowdsourcing perceptibility scores for a portion of the collected street images, sorting them by perceptibility value, and collecting the scoring data; C, training a convolutional neural network classifier with the scoring data collected at step B, and predicting perceptibility scores for the remaining images with the trained classifier; D, predicting street visual perceptibility on a map and giving score estimates for the different perceptibility values of streets. The method is simple and convenient, greatly shortens training time when executing a task, guarantees stable and reliable overall performance, achieves effective recognition of image perceptibility and large-scale quantitative perceptibility analysis, greatly enlarges the application range, and offers good stability, applicability, and practicality.

Description

Technical field

[0001] The invention relates to a method for computer deep learning of image features and quantification of perception.

Background technique

[0002] Image recognition and object detection are important research problems in the field of computer vision, with broad application prospects in face recognition, security monitoring, dynamic tracking, and many other areas. Image recognition refers to the technology of using computers to process, analyze, and understand images in order to identify targets and objects in various patterns. Object detection refers to detecting and identifying a specific target in single or continuous image frames and returning the target's position and size, for example by outputting a bounding box surrounding the target.

[0003] At present, convolutional neural networks (CNN) have been widely used in image classification, object detection, and other problems, extracting information from images without any human intervention an...
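The background notes that a CNN extracts information from images without human intervention; the core operation behind this is a convolution that slides a small filter over the image to produce a feature map. A minimal numpy sketch of that single operation follows (a hand-picked vertical-edge filter rather than a learned one, and not the patent's network):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation), as in a CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge filter: responds where brightness changes left-to-right,
# one of the low-level features a CNN learns automatically during training.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# Synthetic 8x8 image: dark left half, bright right half.
img = np.zeros((8, 8))
img[:, 4:] = 1.0

feat = conv2d(img, sobel_x)
print(feat.shape)   # (6, 6) feature map; strong responses at the edge columns
```

A trained CNN stacks many such filters in layers, which is what yields the deep, hierarchical image features discussed above.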

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04
CPC: G06N3/045; G06F18/24
Inventor: 周博磊, 张帆, 刘浏 (Zhou Bolei, Zhang Fan, Liu Liu)
Owner: 上海城诗信息科技有限公司