
Food volume estimation method based on double-view three-dimensional reconstruction

A dual-view three-dimensional reconstruction technology, applied in computing, 3D modeling, image data processing, etc., which can solve problems such as low accuracy and inconvenience of use.

Status: Inactive | Publication Date: 2017-06-13
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems of inconvenient use and low accuracy, the purpose of the present invention is to provide a food volume estimation method based on dual-view three-dimensional reconstruction. The method first performs image 3D reconstruction; then performs extrinsic calibration through salient point matching, relative pose extraction and scale extraction; then performs dense reconstruction through image rectification, stereo matching and point cloud generation; and finally uses a segmentation map to remove the background, extracts the food surface, and calculates the food volume by integrating the height of the food surface above the dish.
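The integration step mentioned at the end of this paragraph can be written compactly; the following is only a plausible formalization of that sentence (the excerpt gives no explicit formula). With h(x, y) denoting the height of the reconstructed food surface above the dish surface over the segmented food region Ω:

```latex
V \;=\; \iint_{\Omega} h(x, y)\,\mathrm{d}x\,\mathrm{d}y
  \;\approx\; \sum_{(i,j)\in\Omega} \max(h_{ij},\, 0)\,\Delta x\,\Delta y
```

where the sum runs over grid cells of size Δx × Δy on the dish plane and h_ij is the food-surface height in cell (i, j).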



Detailed Description of the Embodiments

[0033] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present invention will be further described in detail below in conjunction with the drawings and specific embodiments.

[0034] Figure 1 is the system flow diagram of the food volume estimation method based on dual-view three-dimensional reconstruction of the present invention. The method mainly includes image 3D reconstruction, extrinsic calibration, dense reconstruction and volume estimation.

[0035] Extrinsic calibration consists of three steps: salient point matching, relative pose extraction and scale extraction;
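As a concrete illustration of these three steps, the sketch below uses OpenCV; it is a minimal reading of the description under stated assumptions, not the patented implementation. The function names (extract_and_match, relative_pose, metric_scale), the use of ORB features, and the use of a known dish dimension as the scale reference are all assumptions introduced here.

```python
import cv2
import numpy as np

def extract_and_match(img1, img2):
    """Salient point matching with ORB features (an assumed choice)."""
    gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(gray1, None)
    kp2, des2 = orb.detectAndCompute(gray2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    return pts1, pts2

def relative_pose(pts1, pts2, K):
    """Relative pose extraction: essential matrix + cheirality check.
    Returns rotation R and a unit-length translation t between the views."""
    E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                      method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t

def metric_scale(known_length, reconstructed_length):
    """Scale extraction: ratio of a real-world reference length (e.g. a
    known dish diameter, assumed here) to the same length measured in the
    up-to-scale reconstruction."""
    return known_length / reconstructed_length
```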

[0036] Dense reconstruction includes image rectification, stereo matching and point cloud generation;
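For the dense-reconstruction stage, a hedged OpenCV sketch is given below. It assumes the intrinsic matrix K (with optional distortion coefficients), the relative pose R, t recovered during extrinsic calibration, and BGR input images; the function name dense_reconstruction is illustrative.

```python
import cv2
import numpy as np

def dense_reconstruction(img1, img2, K, R, t, dist=None):
    """Image rectification -> stereo matching -> point cloud generation."""
    if dist is None:
        dist = np.zeros(5)                      # assume negligible lens distortion
    h, w = img1.shape[:2]
    # Image rectification: warp both views into a common stereo geometry.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K, dist, K, dist, (w, h), R, t)
    m1x, m1y = cv2.initUndistortRectifyMap(K, dist, R1, P1, (w, h), cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(K, dist, R2, P2, (w, h), cv2.CV_32FC1)
    rect1 = cv2.remap(img1, m1x, m1y, cv2.INTER_LINEAR)
    rect2 = cv2.remap(img2, m2x, m2y, cv2.INTER_LINEAR)
    # Stereo matching: semi-global block matching is one common choice.
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disp = sgbm.compute(cv2.cvtColor(rect1, cv2.COLOR_BGR2GRAY),
                        cv2.cvtColor(rect2, cv2.COLOR_BGR2GRAY)).astype(np.float32) / 16.0
    # Point cloud generation: reproject valid disparities to 3D via Q.
    cloud = cv2.reprojectImageTo3D(disp, Q)
    return cloud[disp > 0]                      # N x 3 array of 3D points
```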

[0037] Volume estimation includes food surface extraction, dish surface extraction and volume calculation;

[0038] Food surface extraction uses a segmentation map to remove the background.
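A minimal sketch of the volume-estimation stage is given below, assuming a metric point cloud (coordinates in metres, +z pointing from the dish toward the food) already split by the segmentation map into food-surface points and dish-surface points. The helper names fit_dish_plane and food_volume, the least-squares plane model for the dish, and the 2 mm grid are assumptions, not details taken from the patent.

```python
import numpy as np

def fit_dish_plane(dish_points):
    """Dish surface extraction: least-squares plane z = a*x + b*y + c."""
    A = np.c_[dish_points[:, 0], dish_points[:, 1], np.ones(len(dish_points))]
    coeffs, *_ = np.linalg.lstsq(A, dish_points[:, 2], rcond=None)
    return coeffs                                    # (a, b, c)

def food_volume(food_points, plane, cell_size=0.002):
    """Volume calculation: integrate food height above the dish plane
    over a regular grid (cell_size in metres, i.e. 2 mm cells)."""
    a, b, c = plane
    heights = food_points[:, 2] - (a * food_points[:, 0] + b * food_points[:, 1] + c)
    heights = np.clip(heights, 0.0, None)            # ignore points below the dish
    ij = np.floor(food_points[:, :2] / cell_size).astype(int)
    ij -= ij.min(axis=0)                             # shift cell indices to start at 0
    grid = np.zeros(ij.max(axis=0) + 1)
    np.maximum.at(grid, (ij[:, 0], ij[:, 1]), heights)   # top surface per cell
    return grid.sum() * cell_size ** 2               # sum of the column volumes
```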



Abstract

The invention provides a food volume estimation method based on double-view (dual-view) three-dimensional reconstruction. The method mainly comprises the steps of image 3D reconstruction, extrinsic calibration, dense reconstruction, and volume estimation. In particular, the method first performs image 3D reconstruction; then performs extrinsic calibration through salient point matching, relative pose extraction and scale extraction; then performs dense reconstruction through image rectification, stereo matching and point cloud generation; and finally removes the background by means of a segmentation map, extracts the food surface, and calculates the volume of the food by integrating the food-surface height above the dish. According to the method of the invention, the volume of the food is estimated from acquired food images through image reconstruction, so that the calculation is convenient, fast and accurate. The method provides an accurate automatic dietary-assessment tool for the general population and for people with specific nutritional requirements, and realizes a novel fully automatic dietary-assessment approach.

Description

Technical Field

[0001] The present invention relates to the field of image reconstruction, and in particular to a food volume estimation method based on dual-view three-dimensional reconstruction.

Background Art

[0002] With the improvement of living standards, the prevalence of diet-related chronic diseases, such as obesity and diabetes, is gradually increasing worldwide, and the associated mortality rate is also on the rise. In addition to genetic factors, the etiology of obesity and similar diseases is closely related to people's daily diet. Monitoring and analyzing a user's diet can help the user develop healthier eating habits. Traditionally, people have to roughly estimate their food intake by recording what they eat and searching for relevant reference information, which is neither convenient nor accurate. The food volume estimation method proposed by the present invention can help users assess food intake through image reconstruction. [...]


Application Information

IPC(8): G06T17/10
CPC: G06T17/10
Inventor 夏春秋
Owner SHENZHEN WEITESHI TECH