Nutrition management method and system based on deep learning food image recognition model

An image recognition and deep learning technology applied in the field of meal food image data processing. It addresses problems such as poor operability, recording errors, and the inability of existing methods to reflect long-term dietary intake, and achieves the effect of supporting clinical cohort research.

Pending Publication Date: 2022-05-27
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

The weighing method requires weighing each food before and after meals to obtain information on the type and amount of food consumed. Although accurate, this method is time-consuming, laborious, and difficult to operate, and is only suitable for small-sample studies.
Dietary recall relies on the subject remembering the names and portions of all foods consumed over a short period in the past, but the recall window of this method should not be too...



Examples


Embodiment 5

[0066] An important part of the present invention is the implementation of the food image segmentation system; this embodiment illustrates it in more detail.

[0067] Step 501: the segmentation method divides the image into masks by detecting regions of interest (ROIs). The food image (a matrix) is input and binarized, extracting either three values per pixel (red, green, blue) or one value (black or white);

[0068] Step 502: extract global and local features and the outline of the food image; detect the outlines without establishing a hierarchical relationship among them, and store the outline information with an approximation method so that, for example, a rectangular outline is stored as only 4 points.
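Steps 501 and 502 together describe a standard binarize-and-trace-contours pipeline. The following is a minimal Python sketch using OpenCV; the patent does not name a library, and the flags cv2.RETR_LIST (contours retrieved without a hierarchy) and cv2.CHAIN_APPROX_SIMPLE (a rectangular contour kept as 4 points) are assumptions chosen to match the wording above.

```python
# Hedged sketch of steps 501-502; OpenCV and the Otsu threshold are assumptions.
import cv2

def extract_food_contours(image_path: str):
    img = cv2.imread(image_path)                  # food image as a matrix
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # 3 values -> 1 value per pixel
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binarize
    # RETR_LIST: detect outlines without building a hierarchical relationship;
    # CHAIN_APPROX_SIMPLE: approximate, so a rectangular outline keeps 4 points.
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return binary, contours
```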

[0069] Step 503: pass the output mask to the classifier and obtain feedback; the classifier learns and feeds back the food category labels in the data set, yielding the final mask. The image returned by the food image segmentation is the mask...
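As a rough illustration of step 503, the sketch below keeps only the pixels under one mask and asks a CNN classifier for a food category label. The use of torchvision's DenseNet-121 and the placeholder class count are illustrative assumptions; the patent does not specify the classifier.

```python
# Hypothetical sketch of step 503: label each segmentation mask with a food
# category. DenseNet-121 and num_classes=200 are placeholders, not taken from
# the patent.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF

classifier = models.densenet121(num_classes=200)  # placeholder category count

def label_mask(image: torch.Tensor, mask: torch.Tensor) -> int:
    """image: (3, H, W) float tensor; mask: (H, W) boolean tensor for one region."""
    region = image * mask                     # keep only pixels inside the mask
    region = TF.resize(region, [224, 224])    # match the network's input size
    logits = classifier(region.unsqueeze(0))  # add a batch dimension
    return int(logits.argmax(dim=1))          # predicted food category label
```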

Embodiment 6

[0070] This embodiment proposes a meal image segmentation modeling approach, which includes the following steps:

[0071] 601. A convolutional neural network with an image segmentation focusing mechanism, which lets the network concentrate on key areas and improves its ability to extract discriminative semantic features from the image.
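Step 601's focusing mechanism is described only at a high level. One common way to realize it is a learned per-pixel weight map; a minimal PyTorch sketch under that assumption:

```python
# Sketch of a spatial focusing (attention) module for step 601. The 1x1-conv
# form is an assumption; the patent does not give the exact mechanism.
import torch
import torch.nn as nn

class SpatialFocus(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)  # per-pixel score

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.sigmoid(self.score(x))  # (N, 1, H, W), values in [0, 1]
        return x * weights                      # emphasize key areas of the image
```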

[0072] 602. A weighting mechanism is introduced into food image recognition, and a pixel-level weighting mechanism based on a food image DenseNet is proposed. In this DenseNet, each layer receives additional input from all preceding layers and passes its own feature map to all subsequent layers. The food image DenseNet uses a cascaded approach in which each layer receives prior information from the layers before it, improving the network's ability to extract discriminative semantic features and thereby improving recognition accuracy.
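The dense connectivity described above is standard DenseNet wiring: each layer consumes the concatenation of all earlier feature maps and contributes its own output to every later layer. A minimal PyTorch sketch; the layer sizes are illustrative, not taken from the patent:

```python
# Sketch of the dense (cascaded) connectivity of step 602; growth_rate and the
# BN-ReLU-Conv layer shape follow the standard DenseNet recipe, not the patent.
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    def __init__(self, in_channels: int, growth_rate: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            channels = in_channels + i * growth_rate  # inputs from all prior layers
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))  # extra input from all prior layers
            features.append(out)                     # passed to all subsequent layers
        return torch.cat(features, dim=1)
```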

[0073] 603. Use the image segmentation mechanism to complete the deep learning and output the ...

Embodiment 7

[0074] This embodiment proposes the workflow of an image recognition system, which includes the following steps:

[0075] 701. Capture images of the food with a mobile phone camera or similar device, then obtain data such as food type, volume, weight, and processing method through the data model, and feed the obtained data back into the learning model to further optimize the algorithm.

[0076] 702. Through the server and associated data sets, determine whether the ratios of energy and energy-yielding nutrients fall within an appropriate range; finally, the analysis results are fed back to the user together with corresponding dietary suggestions.

[0077] 703. Image segmentation: the proportion of each food within the whole meal is calculated from the ratio of the pixels it occupies in the picture to the pixels occupied by all foods. By cross-referencing the relevant database (such as the Chinese food composition table), the nutrients contained in each food ar...
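Step 703's portion estimate can be sketched directly from the mask: count each food's pixels, take its share of all food pixels, and scale a nutrient lookup by the estimated weight. The composition_table dict (nutrients per 100 g) and total_weight_g stand in for the Chinese food composition table and the weight estimated by the data model; both are assumptions for illustration.

```python
# Hedged sketch of step 703: pixel-ratio portions plus a nutrient lookup.
import numpy as np

def nutrients_from_mask(mask: np.ndarray, composition_table: dict,
                        total_weight_g: float) -> dict:
    food_pixels = mask[mask > 0]   # label 0 = background, label k = food k
    total = food_pixels.size       # pixels occupied by all foods
    intake = {}
    for label in np.unique(food_pixels):
        ratio = (food_pixels == label).sum() / total  # share of all food pixels
        weight_g = ratio * total_weight_g             # estimated portion weight
        per_100g = composition_table[int(label)]      # e.g. {"energy_kcal": 155, ...}
        intake[int(label)] = {k: v * weight_g / 100.0
                              for k, v in per_100g.items()}
    return intake
```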



Abstract

The invention belongs to the field of meal food image data processing, and particularly relates to a nutrition management method and system based on a deep learning food image recognition model. In the method, a user terminal acquires an image of the food the user is about to eat and inputs it into a trained deep-learning food image recognition model to obtain sub-images of the different types of food; the amount of nutrients contained in each food sub-image is calculated, and the nutrients of all foods are accumulated to obtain the user's total nutrient intake; intake thresholds are set for the various nutrients, and the calculated total intake of each nutrient is compared with its corresponding threshold; according to the comparison result, the type and quantity of ingested food are adjusted, completing the nutrition management. Through the server, the method associates the food intake information uploaded by the user with other data sets, determines whether the ratios of energy and energy-yielding nutrients fall within the appropriate recommended amounts, and finally feeds the analysis results back to the user, encouraging the user to improve his or her diet.
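The threshold comparison at the core of the abstract reduces to accumulating nutrients across all recognized foods and checking each total against its limit. A minimal sketch, with placeholder nutrient names and thresholds that are not taken from the patent:

```python
# Sketch of total-intake accumulation and threshold comparison; all values
# shown are placeholders.
def total_intake(per_food: list[dict]) -> dict:
    totals: dict = {}
    for food in per_food:                        # nutrient dict per food sub-image
        for nutrient, amount in food.items():
            totals[nutrient] = totals.get(nutrient, 0.0) + amount
    return totals

def compare_with_thresholds(totals: dict, thresholds: dict) -> dict:
    # "reduce" when a nutrient exceeds its intake threshold, otherwise "ok"
    return {n: ("reduce" if amt > thresholds.get(n, float("inf")) else "ok")
            for n, amt in totals.items()}

# Example: compare_with_thresholds({"energy_kcal": 2600}, {"energy_kcal": 2200})
# -> {"energy_kcal": "reduce"}
```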

Description

Technical field

[0001] The invention belongs to the field of meal food image data processing, and in particular relates to a nutrition management method and system based on a deep learning food image recognition model.

Background technique

[0002] With the improvement of living standards, people pay more and more attention to their health, and bodily health is closely related to the food a person consumes every day. The rationality of the daily diet therefore plays an important role in health, and the key to judging whether a diet is reasonable is an accurate estimate of the type and amount of food ingested. Commonly used tools for obtaining dietary intake information include the weighing method, dietary recall, and the food frequency questionnaire (FFQ). The weighing method requires weighing each food before and after meals to obtain information on the type and portion of food. Although this method is accurate, it is time-consuming...


Application Information

IPC(8): G06V20/20; G06V10/26; G06V10/20; G06V10/80; G06K9/62; G16H20/60
CPC: G16H20/60; G06F18/253
Inventor: 余海燕, 徐仁应, 余江, 朱珊, 唐成心, 苏星宇, 张胜翔
Owner CHONGQING UNIV OF POSTS & TELECOMM