
Target detection method and system for enhancing foreground and background discrimination

A target detection technology applied in the fields of target detection and deep learning. It addresses the problems that target pixels and non-target pixels are treated identically during feature extraction and that the target's foreground information is insufficiently distinguished from the non-target background information, which leads to false negatives. The effect is to increase the feature-extraction proportion given to target pixels, enhance the discrimination between foreground and background, and strengthen the overall feature expression ability.

Pending Publication Date: 2022-02-18
SHANDONG SYNTHESIS ELECTRONICS TECH

AI Technical Summary

Problems solved by technology

However, in existing methods the feature-extraction proportion of every pixel in the network is the same, and no distinction is made between target pixels and non-target pixels. As a result, the model does not sufficiently distinguish the target's foreground information from the non-target background information, which leads to false negatives.



Examples


Embodiment 1

[0056] As shown in figure 1, this embodiment provides a target detection method that enhances the discrimination between the foreground and the background. It involves a terminal, a server and a system composed of them, and is realized through the interaction between the terminal and the server. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms. The terminal may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, or a smart watch. The terminal and the server may be connected directly or indirectly through wired or wireless communication...

Embodiment 2

[0126] This embodiment provides a target detection system that enhances the discrimination between the foreground and the background.

[0127] A target detection system for enhancing foreground and background discrimination, comprising:

[0128] A mark map generation module, which is configured to: generate a mark map with the same size as the original image according to the label information of the target in the original image;

[0129] The feature output module is configured to: use feature networks of different scales based on the original image to obtain feature outputs corresponding to feature networks of different scales;

[0130] An image adjustment module, which is configured to: introduce the mark map, adjust the size of the mark map, and obtain mark maps of different scales;

[0131] A mark map loss calculation module, which is configured to: calculate the mark map losses at different scales according to the feature outputs corresponding to the feature networks of different scales, in combination with the mark maps of different scales...
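The module list above maps naturally onto a small set of functions. The sketch below is a minimal PyTorch-style illustration of these four modules, assuming that the target labels are axis-aligned bounding boxes, that the mark map is a binary foreground mask, and that each scale's mark map loss is a binary cross-entropy between a one-channel foreground score taken from that scale's feature output and the mark map resized to the same resolution; every name, shape, and loss choice here is illustrative rather than taken from the patent.

```python
import torch
import torch.nn.functional as F

def generate_mark_map(image_hw, boxes):
    """Mark map generation module: build a binary map the same size as the
    original image, 1 inside every labelled target box and 0 elsewhere.
    `boxes` holds (x1, y1, x2, y2) pixel coordinates (an assumed label format)."""
    h, w = image_hw
    mark = torch.zeros(1, 1, h, w)
    for x1, y1, x2, y2 in boxes:
        mark[..., int(y1):int(y2), int(x1):int(x2)] = 1.0
    return mark

def resize_mark_map(mark, feature_maps):
    """Image adjustment module: resize the mark map to the spatial size of
    each scale's feature output (nearest-neighbour keeps the map binary)."""
    return [F.interpolate(mark, size=f.shape[-2:], mode="nearest") for f in feature_maps]

def mark_map_losses(feature_maps, scaled_marks, heads):
    """Mark map loss module: at each scale, project the feature output to a
    one-channel foreground score and compare it with the resized mark map."""
    losses = []
    for feat, mark, head in zip(feature_maps, scaled_marks, heads):
        fg_logit = head(feat)  # (N, 1, h_s, w_s) foreground logits
        losses.append(F.binary_cross_entropy_with_logits(fg_logit, mark))
    return losses

# Example wiring with a hypothetical three-scale backbone (strides 8/16/32).
channels = [256, 512, 1024]
heads = torch.nn.ModuleList(torch.nn.Conv2d(c, 1, kernel_size=1) for c in channels)
feature_maps = [torch.randn(1, c, 512 // s, 512 // s) for c, s in zip(channels, (8, 16, 32))]

mark = generate_mark_map((512, 512), boxes=[(50, 60, 200, 220)])
scaled_marks = resize_mark_map(mark, feature_maps)
per_scale_losses = mark_map_losses(feature_maps, scaled_marks, heads)
```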

Embodiment 3

[0136] This embodiment provides a computer-readable storage medium on which a computer program is stored. When the program is executed by a processor, the steps of the target detection method for enhancing foreground and background discrimination described in Embodiment 1 are implemented.



Abstract

The invention belongs to the fields of deep learning, target detection and the like, and provides a target detection method and system for enhancing foreground and background discrimination. The method comprises the following steps: generating a mark map with the same size as an original image according to the label information of the target in the original image; based on the original image, adopting feature networks of different scales to obtain feature outputs corresponding to the feature networks of different scales; introducing the mark map and adjusting its size to obtain mark maps of different scales; calculating mark map losses at different scales according to the feature outputs corresponding to the feature networks of different scales, in combination with the mark maps of different scales; optimizing a target detection model with a loss function constructed from the mark map losses at different scales; and, based on the original image, adopting the optimized target detection model to obtain a target detection result.
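A short sketch of the last two steps follows, under the assumption that the mark map losses are simply added to the detector's ordinary loss with a scalar weight; the weight, the optimizer, and the names `model`, `detection_loss`, and `lambda_mark` are illustrative placeholders, not details taken from the abstract.

```python
import torch

def total_loss(detection_loss, per_scale_mark_losses, lambda_mark=1.0):
    """Combine the usual detection loss with the mark map losses from all scales."""
    return detection_loss + lambda_mark * sum(per_scale_mark_losses)

# One hypothetical training step optimizing the target detection model:
# optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
# optimizer.zero_grad()
# loss = total_loss(detection_loss, per_scale_mark_losses)
# loss.backward()
# optimizer.step()
```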

Description

technical field

[0001] The invention belongs to the fields of deep learning, target detection and the like, and in particular relates to a target detection method and system for enhancing the discrimination between foreground and background.

Background technique

[0002] The statements in this section merely provide background information related to the present invention and do not necessarily constitute prior art.

[0003] The existing patent "A Target Detection Method Based on Deep Learning" (patent application number: 202010187584.3, publication number: CN111401253A) creates an SCNN to filter the background of the input image, generating a background whose pixels are set to 0 and a foreground whose pixels are set to 1, and then performs target recognition and positioning on the image with the background removed. Although this method can solve the problems of long training time and large resource occupation of existing deep learning technology, this method ne...
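To make the cited prior-art idea concrete: the background removal it describes amounts to applying a binary mask, with background pixels forced to 0 and foreground pixels kept, before recognition runs on the masked image. The snippet below only illustrates that masking step under those assumptions, not the cited SCNN itself.

```python
import torch

def remove_background(image, foreground_mask):
    """Background filtering as described for the cited prior art: the mask is
    0 on background and 1 on foreground, so multiplying zeroes out the background."""
    return image * foreground_mask  # mask broadcasts over the channel dimension

# image: (N, 3, H, W) tensor; foreground_mask: (N, 1, H, W) tensor with values in {0, 1}
masked = remove_background(torch.randn(1, 3, 256, 256), torch.ones(1, 1, 256, 256))
```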


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V20/20, G06V10/46
Inventors: 高朋, 刘辰飞, 陈英鹏, 张朝瑞, 席道亮, 刘明顺
Owner: SHANDONG SYNTHESIS ELECTRONICS TECH