Object recognition method and apparatus based on weakly supervised learning

A weakly supervised learning and object recognition technology, applied in the field of object recognition methods and apparatuses based on weakly supervised learning. It addresses problems such as the difficulty even an expert radiologist faces in locating lesions, the limitations of human perception, and the excessive time and labor required, and achieves the effect of increasing the size of the feature map.

Active Publication Date: 2018-05-24
LUNIT

AI Technical Summary

Benefits of technology

[0016]According to another aspect of the inventive concept, there is provided a computer program coupled to a computing device and stored in a recording medium to execute an object recognition method based on weakly supervised learning, the computer program comprising an operation of training a CNN-based object recognition model using a training target image and an operation of recognizing an object of interest included in a recognition target image...
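
The recognition operation can be pictured with a short sketch. The snippet below is a minimal illustration in Python/PyTorch, assuming a CAM-style model (such as the training sketch given after the Abstract below) that returns per-class representative scores together with per-class activation maps; the `recognize` function, the sigmoid/threshold step, and the bilinear upsampling are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def recognize(model, image, threshold=0.5):
    """Recognize objects of interest in a (1, 3, H, W) recognition target image.

    Assumes `model(image)` returns (per-class representative scores,
    per-class activation maps); both names are illustrative.
    """
    representative, activation_maps = model(image)
    class_probs = torch.sigmoid(representative)            # per-class scores
    # Upsample each activation map to the input resolution so it can be read
    # as a rough localization of the corresponding object of interest.
    localization = F.interpolate(activation_maps, size=image.shape[2:],
                                 mode="bilinear", align_corners=False)
    detected = (class_probs > threshold).squeeze(0)        # classes judged present
    return class_probs, localization, detected
```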

Problems solved by technology

However, it is very difficult even for an expert radiologist to identify the accurate location of a lesion in a patient's radiographic image and diagnose a disease that caused the lesion.
This arises from complicated causes such as th...




Embodiment Construction

[0033]The present inventive concept will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the inventive concept are shown. Advantages and features of the inventive concept and methods of accomplishing the same may be understood more readily by reference to the following detailed description of exemplary embodiments and the accompanying drawings. The inventive concept may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art, and the inventive concept will only be defined by the appended claims. Like reference numerals refer to like components throughout the specification.

[0034]Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning...



Abstract

Provided are an object recognition method and apparatus which determine an object of interest included in a recognition target image using a trained machine learning model and determine an area in which the object of interest is located in the recognition target image. The object recognition method based on weakly supervised learning, performed by an object recognition apparatus, includes extracting a plurality of feature maps from a training target image for which classification results of objects of interest are given, generating an activation map for each of the objects of interest by accumulating the feature maps, calculating a representative value for each of the objects of interest by aggregating the activation values included in the corresponding activation map, determining an error by comparing classification results determined using the representative values with the given classification results, and updating a CNN-based object recognition model by back-propagating the error.
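
As a concrete illustration of the training step summarized above, the following is a minimal sketch in Python/PyTorch. The small backbone, the 1x1 convolution used to accumulate feature maps into per-class activation maps, global average pooling as the aggregation that produces the representative values, and the binary cross-entropy loss are assumptions made for this sketch, not the patented implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeaklySupervisedRecognizer(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        # Convolutional backbone that extracts a plurality of feature maps.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # 1x1 convolution that accumulates the feature maps into one
        # activation map per object of interest (per class).
        self.to_activation_maps = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        feature_maps = self.features(x)                           # (N, 64, H, W)
        activation_maps = self.to_activation_maps(feature_maps)   # (N, C, H, W)
        # Aggregate the activation values of each map into a single
        # representative value per class (here: global average pooling).
        representative = activation_maps.mean(dim=(2, 3))         # (N, C)
        return representative, activation_maps

model = WeaklySupervisedRecognizer(num_classes=2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

def training_step(image, given_labels):
    """image: (N, 3, H, W); given_labels: (N, C) float 0./1. image-level labels."""
    representative, _ = model(image)
    # Compare the classification results derived from the representative
    # values with the given image-level labels, then back-propagate the error.
    loss = F.binary_cross_entropy_with_logits(representative, given_labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only image-level classification labels drive the loss, the per-class activation maps are pushed to respond where each object of interest appears, which is what a weakly supervised method of this kind relies on to localize the object without pixel-level annotations.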

Description

[0001]This application claims the benefit of Korean Patent Application No. 10-2016-0156035, filed on Nov. 22, 2016, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.

BACKGROUND

1. Field

[0002]The present inventive concept relates to an object recognition method and apparatus based on weakly supervised learning, and more particularly, to an object recognition method and apparatus which determine an object of interest included in a recognition target image using a trained machine learning model and determine an area in which the object of interest is located in the recognition target image.

2. Description of the Related Art

[0003]Medical images are one of the important tools for diagnosing and treating patients in modern medicine. In particular, radiographic images are widely used for initial diagnosis because they can be acquired rapidly and at low cost.

[0004]However, it is very difficult...


Application Information

IPC(8): G06K9/46; G06T7/00; G06N3/08; G06N3/04; G06F19/00; G06V10/764
CPC: G06K9/4671; G06T7/0012; G06N3/08; G06N3/04; G16H50/20; G06T2207/10004; G06T2207/20081; G06T2207/20084; G06T2207/30096; G06F19/345; G06T2207/30061; G16H30/40; G06V10/82; G06V10/764; G06N3/045; G06N3/084; G06N3/042; G06F18/24; G06F18/24143
Inventors: KIM, HYO EUN; HWANG, SANG HEUM
Owner: LUNIT