
Quantifying objects on plant by estimating number of objects on plant portion, such as leaf, through convolutional neural network providing density map

A technique based on convolutional neural networks and density maps, applied in the field of computer image processing

Pending Publication Date: 2022-05-10
BASF AG

AI Technical Summary

Problems solved by technology

However, many limitations arise




Embodiment Construction

[0049] Writing convention

[0050] The specification begins by explaining some writing conventions.

[0051] The term "image" refers to the data structure of a digital photograph (i.e., a data structure using a file format such as JPEG, TIFF, BMP, or RAW). The phrase "taking an image" denotes the act of pointing a camera at an object, such as a plant or a part of a plant, and having the camera store the image.

[0052] This specification uses the term "show" when explaining the content (i.e., the semantics) of an image, for example in a phrase such as "the image shows a plant". Human users do not need to view the images, however. Computer-user interaction is expressed with the term "display", as in "the computer displays the plant image to the expert", where the expert user looks at the screen to see the plant in the image.

[0053] The term "annotation" represents metadata received by a computer when an expert user views displayed images and interacts with the computer. The term "ann...



Abstract

Quantifying plant infestation is performed by estimating the number of biological objects (132) on a portion (122) of a plant (112). A computer (202) receives a plant image (412) captured from a particular plant (112) and, using a first convolutional neural network (262/272), derives a portion image (422) showing a portion of the plant. The computer (202) segments the portion image into tiles and processes the tiles into density maps using a second network. The computer (202) combines the density maps into a combined density map at the size of the portion image and integrates its pixel values into an estimated number of objects for the portion image. Object categories (132(1), 132(2)) may be distinguished to fine-tune the quantification and to identify category-specific countermeasures.
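The counting pipeline described in the abstract (tile the portion image, run each tile through a density-map network, stitch the maps back together, and integrate pixel values to obtain a count) can be sketched in NumPy as follows. This is an illustrative assumption, not the patent's actual architecture: the helper names (`split_into_tiles`, `stitch`) are hypothetical, and `fake_density_net` stands in for the second CNN as an identity function over a synthetic density map containing three unit-mass blobs.

```python
import numpy as np

def split_into_tiles(img, tile):
    # Pad so both dimensions divide evenly, then cut non-overlapping tiles.
    H, W = img.shape
    ph, pw = (-H) % tile, (-W) % tile
    img = np.pad(img, ((0, ph), (0, pw)))
    tiles = []
    for r in range(0, img.shape[0], tile):
        for c in range(0, img.shape[1], tile):
            tiles.append(img[r:r + tile, c:c + tile])
    return tiles, img.shape

def stitch(tiles, shape, tile):
    # Reassemble per-tile density maps into one map at the padded image size.
    out = np.zeros(shape)
    i = 0
    for r in range(0, shape[0], tile):
        for c in range(0, shape[1], tile):
            out[r:r + tile, c:c + tile] = tiles[i]
            i += 1
    return out

def fake_density_net(tile):
    # Stand-in for the second CNN: identity, because the toy input below
    # is already a density map. A real network would map RGB pixels to density.
    return tile

# Toy "portion image" encoded directly as a density map with 3 unit-mass objects.
density = np.zeros((64, 64))
for y, x in [(10, 10), (30, 50), (55, 20)]:
    density[y, x] = 1.0

tiles, padded_shape = split_into_tiles(density, 16)
maps = [fake_density_net(t) for t in tiles]
combined = stitch(maps, padded_shape, 16)
count = combined.sum()  # integrating pixel values yields the object count
print(round(count))     # → 3
```

Summing (integrating) the stitched map recovers the total object count because each object contributes unit mass to the density map, which is the standard property of density-map-based counting.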

Description

Technical field

[0001] The present disclosure relates generally to computer image processing, and more particularly to techniques for quantifying objects on parts of plants (or simply "plant parts"). Even more specifically, the present disclosure relates to techniques for quantifying plant infestation by estimating the number of insects or other biological objects on plant leaves.

Background

[0002] As is well known, agricultural plants such as crops grow in an environment where they coexist with biological objects. These objects tend to be located on the plants, usually attached to them (at least temporarily), and they interact with the plants.

[0003] Different objects tend to attach themselves to different parts of the plant. For example, some insects may sit on leaves, while others may stick to stems or branches.

[0004] From a farmer's perspective, object-plant interactions have two directions or aspects. In a first direction there are biolo...

Claims


Application Information

IPC (IPC8): G06V20/60; G06V10/26; G06V10/764; G06V10/56; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/082; G06N3/045; G06F18/2414; G06N20/10; G06V20/10; G06V10/454; G06V10/56; G06V10/82; G06V10/764; G06N3/048; G06V20/17; G06T7/11; G06V10/762; G06T7/0012; G06T2207/20081; G06T2207/20084; G06V20/188
Inventor: A. Alvarez Gila, A. M. Ortiz Barredo, D. Roldán López, J. Romero Rodríguez, C. M. Spangler, C. Klukas, T. Eggers, J. Echazarra Huguet, R. Navarra Mestre, A. Picón Ruiz, A. Bereciartua Pérez
Owner BASF AG